This is very true, but for scientific use, size doesn't always matter as much as it does for consumer use. TBH, I think we'll eventually move to a hybrid of the two states I mentioned, where you have a crazy powerful computer in your pocket, but if you need even more insane power, you connect to a remote system that you either own or rent time on (kinda coming full circle, in a way). For pretty much everything except graphics, this is stupid easy to do even now (especially with Linux); the real hurdle, imo, will be integrating it into consumer devices and making it easy for the proverbial grandmother to use.
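Just to illustrate how easy the remote piece already is, here's a minimal sketch of offloading work to a remote Linux box over plain SSH (assuming you have passwordless SSH keys set up; the hostname here is made up):

```python
import subprocess

def run_remote(host: str, command: str) -> str:
    """Run a shell command on a remote host via ssh and return its stdout."""
    result = subprocess.run(
        ["ssh", host, command],   # delegates auth to your SSH config/keys
        capture_output=True,
        text=True,
        check=True,               # raise if the remote command fails
    )
    return result.stdout

if __name__ == "__main__":
    # Offload a query to the big machine; only the result crosses the network.
    print(run_remote("user@compute.example.com", "nproc && uname -m"))
```

Swap the command for your actual number-crunching job and that's basically the whole workflow; graphics is the hard part because you'd have to stream the rendered frames back.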
I know I'll probably look back on this in a few years' time and laugh at myself, but I really think we're approaching a different wall in the consumer market, where the amount of processing power the vast majority of people need is far outstripped by what the hardware of the day can deliver. I think we're mostly there already in the desktop market, except for gamers. Even PC graphics, I think, will hit a wall in the next 5-10 years, where you can't meaningfully improve graphics quality anymore and we can render true photorealistic video in real time on our desktops. The problem will be mobile devices, but with the recent improvements in the ARM architecture, and if we can integrate a good, standardized remote-processing architecture, I think that market could hit the same wall too.
Of course, scientific, enterprise, and a few other computing markets will continue to require more and more power, but for the other 80-90% of the market, I really think we'll reach a point where it doesn't matter.