
docker: allow kafka to be consumed from the host
ClosedPublic

Authored by douardda on Sep 13 2021, 4:51 PM.

Details

Summary

Using a config like:

  journal:
    brokers:
      - 127.0.0.1:29002

should allow executing swh commands such as `swh storage replay` or
`swh dataset export` from the host, consuming Kafka topics from the
docker-compose environment.
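As a concrete illustration, here is a minimal sketch of driving a replay from the host with such a config. The file path is hypothetical, the broker address mirrors the summary above, and a real replay configuration would also need the destination storage section, which is omitted here; the config file is assumed to be picked up through the SWH_CONFIG_FILENAME environment variable.

  $ cat ~/.config/swh/host-journal.yml
  journal:
    brokers:
      - 127.0.0.1:29002

  $ export SWH_CONFIG_FILENAME=~/.config/swh/host-journal.yml
  $ swh storage replay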

Also configure the Kafka topics' cleanup policy to 'compact' instead of the
default 'delete', so users can keep their data in the test Kafka server for
more than a week.
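For reference, and independently of how the docker-compose sets it, the cleanup policy of a topic can be inspected or changed per topic with the standard Kafka tooling. The broker address mirrors the summary, the topic name is only an example, and the exact script name depends on the Kafka distribution:

  $ kafka-configs --bootstrap-server 127.0.0.1:29002 \
      --entity-type topics --entity-name swh.journal.objects.revision \
      --describe

  $ kafka-configs --bootstrap-server 127.0.0.1:29002 \
      --entity-type topics --entity-name swh.journal.objects.revision \
      --alter --add-config cleanup.policy=compact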

Diff Detail

Repository
rDENV Development environment
Lint
Automatic diff as part of commit; lint not applicable.
Unit
Automatic diff as part of commit; unit tests not applicable.

Event Timeline

Add a bit of documentation in the README file on how to consume kafka from the host

This revision is now accepted and ready to land. Sep 14 2021, 11:03 AM
vlorentz added inline comments.
docker/README.rst:188

why? (I see below that the port was not actually 5092; but why not set it to 5092 in docker-compose?)

docker/README.rst:188

because it's what I had in my docker-compose when I pushed this diff :-) and 29092 is a palindrome, which is thus much better :-)

But ok, I see, 5092 kinda makes sense with the other exposed ports.

Keep port 5092 exposed on host
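To illustrate the outcome of this thread, here is a hypothetical docker-compose excerpt keeping port 5092 published on the host; this is only a sketch, not the actual service definition from the diff (image and listener configuration omitted):

  kafka:
    ports:
      - "5092:5092"   # broker listener reachable from the host, e.g. as 127.0.0.1:5092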