@matrix Read an article last week, and some guys were claiming we're maybe 2 years away from AGI.
On one hand I doubt we're THAT close; on the other, it shouldn't take a genius to realize that the pattern-seeking approach LLMs use is what human brains do too, so maybe we're closer than people realize. I wouldn't be surprised if it took years to realize we have an AGI on our hands after it was created.
@alyx It's possible that LLMs will soon be refined so much that they nail short queries 100% of the time, but all of them start shitting the bed as context grows, and I doubt that's going to change anytime soon.