

Yes, I did read the OP.
Edit: I see this was downvoted without a response, but I'll put this out there anyway.
If you host a public site that you expect anyone to be able to access, there is very little you can do to exclude an AI scraper specifically.
Hosting your own site for personal use? IP blocks etc. will prevent scraping.
But how do you distinguish legitimate users from scrapers? It's very difficult.
They will use up your traffic either way. Don't want that? You could waste their time (a tarpit), or take your hosting away from public access.
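One partial measure worth mentioning: you can ask crawlers to stay away via robots.txt. This only works against well-behaved bots (a scraper is free to ignore it), but the major AI companies do publish user-agent strings for their crawlers. A minimal sketch, using a few real published agents (GPTBot, ClaudeBot, CCBot):

```
# robots.txt - politely refuse known AI crawlers (they can still ignore this)
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

# Everyone else is allowed
User-agent: *
Allow: /
```

For bots that ignore robots.txt you're back to IP blocks or a tarpit, as above.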
Downvoter, what's your alternative?
Haven't used Forgejo, but -
Is the zip file a dump of the filesystem / data / config the application needs?
You would want to extract that somewhere and mount the directory into the container.
https://docs.podman.io/en/latest/markdown/podman-volume-mount.1.html
So, for example, if you extract to
/home/forgejo
you would map this into the container as a volume in podman. This Stack Overflow answer might help: https://stackoverflow.com/questions/69298356/how-to-mount-a-volume-from-a-local-machine-on-podman/71576242#71576242