

GPT-3 is 800GB while the entirety of the English Wikipedia is around 10GB compressed. So yeah, it doesn't store every detail of everything, but LLMs do memorize a lot of things verbatim. Also see https://bair.berkeley.edu/blog/2020/12/20/lmmem/
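A minimal sketch of the kind of memorization probe the BAIR post describes: feed the model a prefix it almost certainly saw in training and check whether greedy decoding reproduces the continuation verbatim. This uses GPT-2 via the `transformers` library as a freely available stand-in (an assumption; the post studies GPT-2 and larger models, and GPT-3 itself isn't downloadable):

```python
# pip install transformers torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# A well-known passage the training corpus almost certainly contained.
prefix = "We the People of the United States, in Order to form"

inputs = tokenizer(prefix, return_tensors="pt")
# Greedy decoding (no sampling): if the model completes the passage
# word-for-word, that's evidence of verbatim memorization rather
# than paraphrase.
outputs = model.generate(
    **inputs,
    max_new_tokens=30,
    do_sample=False,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```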
PHP 8 makes it possible to rescue the princess, but your 83 legacy princesses are all still on PHP 5.