This is kinda the crux of it. The flat reality is that an application that demands more power generally takes more resources to create. Hell, as we are all gamers, this is pretty obvious. Games with higher system requirements, better visual fidelity, and all those fancy buzzwords just flat out cost more to make*. So as the cost of developing for the bleeding edge increases, the incentive to target it decreases. It's why a lot of software still doesn't take advantage of multi-core performance.
So adding more speed and power to CPUs will, generally, increase the cost of building programs that max them out, which in turn creates less incentive to target that hardware, which means less demand for higher-power CPUs. The end result is the mass realization that current power is generally 'good enough', so development effort is better spent elsewhere, with only some still going here.
And, personally, I can't fault them for focusing on power consumption, heat dissipation, and making top-of-the-line cards more mobile-friendly. It makes sense, and it's a net win. It's not for nothing that I love my Surface Pro tablet, after all.
*Obvious caveats apply, so don't go quoting Witcher 3 at me, people!