Gonna pre-register this take: there’s a good chance that the explosive rate of LLM progress over the past few years is about to hit a ceiling and settle into a more gradual slope. I still expect LLMs to keep progressing, and potentially reach AGI, over the long term.
The mechanism is mostly that there aren’t many OOMs of cost-scaling left. We’ve had explosive progress because willingness-to-spend kept increasing, but that’s gonna hit soft and hard caps very soon. There’s also lots of uncertainty re: running out of training data, which might be a bottleneck too.
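To make the "not many OOMs left" claim concrete, here's a toy back-of-envelope in Python. Every dollar figure below is an assumption I'm picking for illustration, not a sourced number — the point is just that plausible spending caps sit only a few orders of magnitude above plausible current frontier-run costs:

```python
import math

# Illustrative back-of-envelope. All dollar figures are assumptions,
# not sourced estimates.
current_run_cost = 1e8  # assumed: ~$100M for a frontier training run today
soft_cap = 1e11         # assumed: ~$100B, roughly big-tech annual capex scale
hard_cap = 1e12         # assumed: ~$1T, a rough fraction-of-world-GDP ceiling

# Orders of magnitude of cost-scaling remaining under each cap.
ooms_to_soft_cap = math.log10(soft_cap / current_run_cost)
ooms_to_hard_cap = math.log10(hard_cap / current_run_cost)

print(f"OOMs left before the soft cap: {ooms_to_soft_cap:.0f}")
print(f"OOMs left before the hard cap: {ooms_to_hard_cap:.0f}")
```

Under these made-up numbers that's ~3–4 OOMs of headroom, versus the several OOMs of spend growth already behind us — which is the whole mechanism of the take.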