• 0 Posts
  • 1 Comment
Joined 2Y ago
Cake day: Jul 03, 2023


GPT-3 is roughly 800GB, while the entirety of the English Wikipedia is around 10GB compressed. So no, it doesn't store every detail of everything, but LLMs do memorize a lot of things verbatim. Also see https://bair.berkeley.edu/blog/2020/12/20/lmmem/
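The size comparison above can be checked with simple back-of-envelope arithmetic. This sketch assumes a parameter count of ~175 billion for GPT-3 (a commonly cited figure, not stated in the comment) and 32-bit floats per parameter:

```python
# Rough back-of-envelope estimate; the 175B parameter count and
# 4-byte (fp32) weight size are assumptions, not exact specs.
params = 175_000_000_000       # assumed GPT-3 parameter count
bytes_per_param = 4            # fp32 weights

model_size_gb = params * bytes_per_param / 1e9
print(f"model: ~{model_size_gb:.0f} GB")   # ~700 GB, same ballpark as the 800 GB figure

wikipedia_gb = 10              # compressed English Wikipedia, per the comment
ratio = model_size_gb / wikipedia_gb
print(f"~{ratio:.0f}x the size of compressed English Wikipedia")
```

Even at this rough level, the model's weights are tens of times larger than the compressed encyclopedia, which is why verbatim memorization of common text is plausible.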