The great thing about AI is that it is finally a killer feature that is useful to, and enjoyed by, users worldwide. And the tech industry finally has an excuse to upsell 16GB as the baseline, and perhaps even push 24GB or 32GB of memory, along with GPU/NPU and CPU upgrades.
For users, a few hundred dollars extra (on top of the original purchase) is such a small amount compared to the productivity gain over the usable lifespan of the computer.
AI alone has increased not only server hardware requirements but also client hardware requirements. It basically answers the question everyone has been asking: what comes after the smartphone? And to a degree, it is AI (or LLMs).
This will easily push the whole semiconductor industry forward all the way to 2032~2035. We will be at 8A or 6A by then.
PCIe 7? Possibly PCIe 8? WiFi 9, which is a fixed version of WiFi 8. There are so many great hardware improvements coming, all driven by the demand for greater computing.
The software side has been rather boring, TBH. I really like the phrase Alan Kay uses to describe modern software: "reinventing the flat tire".
As interesting as this breakdown of the current state of things is, it doesn’t tell us much we didn’t know or predict much about the future, and that’s the thing I most wanted to hear from an expert article on the subject, even if we can take it with a large pinch of salt.
>software is getting slower more rapidly than hardware is becoming faster.
>Wirth attributed the saying to Martin Reiser, who in the preface to his book on the Oberon System wrote: "The hope is that the progress in hardware will cure all software ills. However, a critical observer may observe that software manages to outgrow hardware in size and sluggishness."
I wish there were more instances of developments like Mac OS X 10.6, where rather than new features, the software was simply optimized for a given CPU architecture and the focus was on improving performance.
We haven't been bound by Moore's law because we just waste computing power, since programmers are expensive. No one is trying to optimize nowadays except in very niche places. And when push comes to shove, we just start wasting slightly less, like adding a JIT to a scripting language 15 years too late.
Given the power and ubiquity of smart phones, most people don't need any other computer in their personal life. What can be done locally on a smart phone seems like it will be more constrained by battery life and physical size than anything else, and there will continue to be a mix of things that can run on-device and other more compute-hungry functions run in the cloud. I don't see smartphones being replaced or augmented by other devices like smart glasses - people want one device that does it all, and not one they wear on their face.
The same is somewhat true for business use too, especially if compute-heavy AI use becomes more widespread - some functions local and the heavy work done in AI-ready datacenters. I'm mildly surprised that there hasn't already been a greater shift away from local compute to things like Chromebooks (ubiquitous in schools) since it has so many benefits (cost, ease of management, reduced need to upgrade), but maybe it will still come if the need to rely on datacenter compute increases.
Even if we imagine futuristic power-sipping neuromorphic chips, I don't see that changing things very much other than increasing the scope of what can be done locally on a power budget.