LLMs would already be close to AGI if they could remember larger contexts
@special-boy Elon Masik
@special-boy @matrix my understanding, possibly wrong, is that the HBM memory can't just be "increased" without all kinds of issues, possibly having to do with fanout or timing.
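To put rough numbers on that (a back-of-envelope sketch, not specs for any real card: the per-die capacity, stack height, and interface width below are assumed ballpark values for an HBM3-class part): capacity scales with stacks × dies per stack × GB per die, and every extra stack needs its own very wide interface routed across the interposer, which is roughly where the fanout/timing pain comes in.

```python
# Back-of-envelope: why HBM capacity is tied to stack count, not a firmware toggle.
# All figures are assumed ballpark values for an HBM3-class accelerator.

GB_PER_DIE = 2            # assumed capacity of one DRAM die in a stack
DIES_PER_STACK = 8        # assumed 8-high stack
BITS_PER_STACK_IF = 1024  # each stack gets its own ~1024-bit interface

def total_capacity_gb(num_stacks: int) -> int:
    """Total HBM capacity for a given number of stacks on the interposer."""
    return num_stacks * DIES_PER_STACK * GB_PER_DIE

def total_interface_bits(num_stacks: int) -> int:
    """Aggregate signal width the GPU has to route and keep in timing."""
    return num_stacks * BITS_PER_STACK_IF

for stacks in (4, 6, 8):
    print(f"{stacks} stacks -> {total_capacity_gb(stacks)} GB, "
          f"{total_interface_bits(stacks)}-bit aggregate interface")
```

So more GB means more stacks or taller stacks, and both cost interposer area, power, and signal-integrity headroom rather than just a bigger number on the box.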
With the way they fuck normal people when it comes to memory pricing, I don't want to imagine how much profit they make per GB on AI cards and systems for businesses.