Discussion about this post

Rob Carpenter

I appreciate the thoughtful analysis. My concern is that it assumes AI will progress along a historical average trend line. I’m not convinced that’s the right model.

AI feels more like it’s on an intelligence J-curve. For a long time, progress looks incremental, even overhyped relative to impact. Then capability compounds. Once systems begin meaningfully improving reasoning, autonomy, and self-directed learning, historical averages stop being useful predictors. They understate what happens when feedback loops kick in.

Leaders like Sam Altman and Demis Hassabis have publicly projected timelines for AGI in the 2028–2030 range. They could be wrong. But if they’re even directionally correct, we’re not talking about marginal productivity gains—we’re talking about a structural shift in how cognitive labor is performed.

And that’s the key point: if and when we reach AGI, I don’t think “decision-based” knowledge work remains protected territory. Strategy, analysis, forecasting, and optimization are ultimately reasoning tasks. If general reasoning becomes automatable at scale, the boundary between “assistive AI” and “replacement AI” blurs quickly.

History is useful—but if we’re entering a regime change, historical averages may dramatically understate what’s coming.

Andrii Buvailo, PhD

Thanks for the mention! It is the power and community vibe of Substack that is hard to beat anywhere else, honestly; just a random comment can lead to an article mention.

To the point of the article: I had a chat this morning on this exact topic with a senior software developer at a huge messenger company (I won't disclose which one, but you almost certainly know it). He is a highly paid, highly skilled professional, and he is really worried about the job market. He said the latest AI models handle maybe 80-85% of his coding and other tasks, and that they can, let's say, react to Jira tickets, take an assignment, and do the work on certain tasks with up to 98% precision (the percentages are obviously off the top of his head, but you get the point). What's crazier, he said that as recently as spring 2025 he would struggle to get AI to help with his work; he would "spend more time struggling with it than really getting value." Things changed sharply just lately... So the rate of progress is really skyrocketing, and it is hard to predict what will happen soon...

I ask this same question of almost all the IT folks I meet, and the degree of usage varies quite a bit, though. While this guy is obviously an active AI user, some of my other peers do not seem that impressed yet... just a random observation.
