Lemmy newb here, not sure if this is right for this /c.
An article I found from someone who hosts their own website and micro-social network, about their experience with web-scraping bots that refuse to respect robots.txt, and how they deal with them.
https://github.com/TecharoHQ/anubis/issues/92
They seem to be working on a Traefik middleware, but in the meantime the issue has a guide for setting it up manually with Traefik.
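For anyone curious what the manual route looks like before the middleware lands: the general pattern is to run Anubis as a reverse proxy between Traefik and your site, so Traefik's router points at Anubis instead of the backend, and Anubis forwards clients that pass its check. Here's a rough sketch using Traefik's file-provider syntax; the hostname, port, and router/service names are placeholders I made up, not from the linked guide:

```yaml
# Traefik dynamic config (file provider) -- placeholder names/hosts/ports
http:
  routers:
    mysite:
      rule: "Host(`example.com`)"
      service: anubis            # route to Anubis instead of the real backend
  services:
    anubis:
      loadBalancer:
        servers:
          - url: "http://127.0.0.1:8923"   # wherever your Anubis instance listens
```

Anubis itself is then configured separately (if I'm reading its docs right, via env vars like `BIND` for where it listens and `TARGET` for the real backend it proxies to), so the chain ends up Traefik → Anubis → your site. Treat the above as a sketch and check the guide in the issue for the actual steps.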