Lemmy newb here, not sure if this is right for this /c.
This is an article I found from someone who hosts their own website and micro social network, about their experience with web-scraping bots that refuse to respect robots.txt and how they deal with them.
Are there any guides to using it with reverse proxies like Traefik? I’ve been wanting to try it out but haven’t had time to do the research yet.
https://github.com/TecharoHQ/anubis/issues/92
They seem to be working on a Traefik middleware, but in the meantime there is a guide for setting it up manually with Traefik.
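For anyone wondering what the manual route roughly looks like, the idea is: point your Traefik router at Anubis instead of your app, and tell Anubis where the real app lives so it can forward traffic that passes its challenge. Here's a docker-compose sketch of that pattern. Treat the image path, default port, and environment variable names as assumptions on my part; double-check everything against the Anubis docs before using it.

```yaml
# Rough sketch only -- image name, default port, and env var names are my
# best guesses; verify them against the Anubis README before relying on this.
services:
  traefik:
    image: traefik:v3.0
    command:
      - "--providers.docker=true"
      - "--providers.docker.exposedbydefault=false"
      - "--entrypoints.web.address=:80"
    ports:
      - "80:80"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro

  anubis:
    # Anubis sits between Traefik and the real app and serves the
    # proof-of-work challenge before letting clients through.
    image: ghcr.io/techarohq/anubis:latest   # assumed image path
    environment:
      BIND: ":8923"                # port Anubis listens on (assumed default)
      TARGET: "http://myapp:80"    # where Anubis forwards traffic that passes the challenge
      DIFFICULTY: "4"              # proof-of-work difficulty (assumed default)
    labels:
      # Traefik routes the public hostname to Anubis, not to the app directly.
      - "traefik.enable=true"
      - "traefik.http.routers.mysite.rule=Host(`example.com`)"
      - "traefik.http.routers.mysite.entrypoints=web"
      - "traefik.http.services.mysite.loadbalancer.server.port=8923"

  myapp:
    # The actual site. No Traefik labels, so it is only reachable
    # through Anubis on the internal Docker network.
    image: nginx:alpine
    expose:
      - "80"
```

The main design point is that the app container gets no Traefik labels at all, so scrapers can't bypass Anubis by hitting it directly; everything has to come through the challenge first.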