
Kafka throws flush timeout error
Closed, Migrated · Edits Locked

Description

I tried running `sudo docker-compose exec swh-loader swh loader run nixguix "https://guix.gnu.org/sources.json"` on my self-hosted swh instance and got the following error:

ERROR:swh.loader.package.loader:Failed to initialize origin_visit for https://guix.gnu.org/sources.json
Traceback (most recent call last):
  File "/srv/softwareheritage/venv/lib/python3.7/site-packages/swh/loader/package/loader.py", line 389, in load
    self.storage.origin_add([origin])
  File "/srv/softwareheritage/venv/lib/python3.7/site-packages/swh/core/api/__init__.py", line 181, in meth_
    return self.post(meth._endpoint_path, post_data)
  File "/srv/softwareheritage/venv/lib/python3.7/site-packages/swh/core/api/__init__.py", line 278, in post
    return self._decode_response(response)
  File "/srv/softwareheritage/venv/lib/python3.7/site-packages/swh/core/api/__init__.py", line 354, in _decode_response
    self.raise_for_status(response)
  File "/srv/softwareheritage/venv/lib/python3.7/site-packages/swh/storage/api/client.py", line 29, in raise_for_status
    super().raise_for_status(response)
  File "/srv/softwareheritage/venv/lib/python3.7/site-packages/swh/core/api/__init__.py", line 344, in raise_for_status
    raise exception from None
swh.core.api.RemoteException: <RemoteException 500 KafkaDeliveryError: ['flush() exceeded timeout (120s)', [['origin', {'url': 'https://guix.gnu.org/sources.json'}, 'No delivery before flush() timeout', 'SWH_FLUSH_TIMEOUT']]]>
{'status': 'failed'}

Note that the error comes from Kafka: the storage's journal writer got no delivery confirmation for the `origin` message before the 120s flush() timeout expired.
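For context, a minimal stand-alone sketch of the flush semantics involved (this is illustrative only, not SWH code; `ToyProducer` is hypothetical, but it mimics how `confluent_kafka.Producer.flush(timeout)` returns the number of messages still awaiting delivery, which swh.journal then reports as a `KafkaDeliveryError`):

```python
import time

class ToyProducer:
    """Toy stand-in for a Kafka producer: produced messages stay pending
    until some delivery acknowledgement removes them, and flush() waits
    up to a timeout for the pending queue to drain."""

    def __init__(self):
        self.pending = []  # messages awaiting delivery reports

    def produce(self, topic, payload):
        self.pending.append((topic, payload))

    def flush(self, timeout):
        # Wait until the deadline or until everything is delivered.
        deadline = time.monotonic() + timeout
        while self.pending and time.monotonic() < deadline:
            time.sleep(0.01)
        # Like confluent_kafka, return how many messages never made it.
        return len(self.pending)

producer = ToyProducer()
producer.produce("origin", {"url": "https://guix.gnu.org/sources.json"})

# No broker ever acknowledges the message, so flush() times out:
leftover = producer.flush(timeout=0.05)
if leftover:
    # swh.journal raises KafkaDeliveryError for leftovers like this one.
    print(f"flush() exceeded timeout: {leftover} undelivered message(s)")
```

The symptom in the traceback above matches this pattern: the broker never confirmed delivery, so flush() ran out its timeout with messages still queued.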

Task creation suggested by @vlorentz in T2687#64550

Event Timeline

KShivendu created this object in space S1 Public.
KShivendu added projects: Journal, Core Loader.
vlorentz edited projects, added Docker environment; removed Core Loader.

If you face this issue, try restarting the containers with `docker-compose down` followed by `docker-compose up`.

Also, a container healthcheck could be added in swh-journal to automatically detect such Kafka issues before loaders hit the flush timeout.
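One possible shape for such a check, sketched as a docker-compose fragment (the service name and probe command are assumptions, not the actual swh-docker configuration; `kafka-broker-api-versions.sh` ships with the Kafka distribution):

```yaml
# Hypothetical docker-compose.yml fragment -- service name is an assumption.
services:
  kafka:
    healthcheck:
      # Succeeds only if the broker answers API requests on its listener.
      test: ["CMD-SHELL", "kafka-broker-api-versions.sh --bootstrap-server localhost:9092 >/dev/null 2>&1"]
      interval: 30s
      timeout: 10s
      retries: 5
```

With a healthcheck like this in place, dependent services can gate on `condition: service_healthy` in `depends_on`, so loaders only start once the broker actually accepts connections.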