IDC’s PC sales numbers show a dramatic fall, but they’re not the whole story.
Before we blame any one thing, we need to take a much more nuanced view and look back at the last decade of the IT world. After all, nothing is ever as simple as it seems.
We’re at an interesting inflexion point in the IT industry, where innovation is moving away from desktop PC hardware and into software, into the server, and up to the cloud.
The truth is quite simple: PCs are lasting longer, they’re not getting measurably faster, and software is getting better. Why do you need to buy a new PC when you can get better performance with a software upgrade on your old hardware?
If I were to put a finger on the point where everything changed, where Windows stopped being the driver for PC sales, I’d have to point at Windows Vista.
That was the point where Microsoft and the PC OEMs stopped trusting each other. Microsoft made a bet on PC hardware and capabilities, and the PC industry pulled the rug out from under it, forcing the mess that was Vista Basic on users as they tried to sell cheap PCs with old graphics hardware.
That meant Microsoft had to change. It couldn’t make that same bet on hardware anymore. It didn’t trust OEMs to deliver on the promises the silicon vendors were making (and if we look at the initial Windows 8 hardware, it’s pretty clear it was right to make that decision). So it made the software better instead.
New releases of Windows would need fewer resources, offer better performance, and (particularly important to mobile users) use less power.
So we shouldn’t have been surprised when Windows 7 came along, bringing all that better performance on the same hardware. There wasn’t a reason to buy a new PC for a new Windows any more.
We could just buy a cheap upgrade and get more life from our PCs. My Vista-era desktop systems got a performance bump because the software got better, taking advantage of the older hardware. I didn’t need new PCs, I didn’t even need a new graphics card.
I only bought my current PC last year because a hardware failure fried the Vista machine’s motherboard. If I hadn’t had a hardware failure I suspect I’d still be using that PC today.
The new machine has the same hard disks, even the same graphics card, using the same multi-monitor setup as that original Vista-era machine. It wasn’t any faster, but it got another performance bump when I upgraded it to Windows 8 last summer. We even saw significant improvements on XP-era test hardware.
So yes, that means Windows 8 is one thing that’s to blame for the slow-down in PC sales. You don’t need a new PC to see a benefit from it, especially when you’re getting a 10 percent speed bump over Windows 7 on Vista-era hardware, and an extra hour or so of battery life on a three-year-old laptop.
A cheap upgrade download and your old PC gets a new lease of life. Why do you need to spend several hundred pounds or dollars for extra performance when it comes with an operating system upgrade for a fraction of the cost?
So if our software gets better on older hardware, what about all that new hardware?
First we need to look at the trends that drive the PC industry. Like all consumer industries it has to respond to customer demand, and those demands have changed over the last couple of decades – changes that are having a significant impact on more than just the PC market.
Fed up with planned obsolescence, we now demand things that last. How long did you keep your last washing machine, your vacuum cleaner, your last car?
Devices may not be user serviceable, but they just don’t break the way they used to. Our dishwasher has moved house with us more than once, as has our washing machine. My car is thirteen years old, and still gets great mileage. Why would I need to change them?
The fact that today’s software gets better performance out of yesterday’s hardware can’t be ignored. It’s changing more than the PC industry – just look at Ford’s Sync strategy for in-car entertainment.
Why rely on fixed car hardware that’ll be with the driver for most of a decade, if you can have an API and an app ecosystem? Each time Pandora upgrades on my phone I get an improved experience, and Ford hasn’t had to change my car.
That trend accounts for one aspect of the longevity of PCs. They’ve stopped breaking, because we don’t want PCs that break. But there’s another aspect, the Moore’s Law elephant in the room.
A few years back PCs stopped getting faster. They just got more cores. As transistor density increased, so did clock speeds – but the faster processors got, the hotter they ran. And the hotter they got, the greater the risk of quantum instabilities in the billions of transistors that new processes were capable of delivering.
If you couldn’t go faster, then how could you scale the processor? The answer was obvious: more transistors per processor meant more cores per processor. Instead of a single core handling everything, the same area of silicon could offer two, four, even eight.
Applications would have to take advantage of those cores, changing their single-threaded architectures into multi-threaded ones, able to take full advantage of the parallel processing capabilities of those new processors. The megahertz race was over; now we’d reap the benefits in new ways.
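The restructuring that shift demanded can be sketched in a few lines. This is an illustrative Python example, not the kind of code those desktop applications were written in: the `checksum` function and the data are invented, and the point is purely the shape of the change – one loop on one core becoming the same work fanned out across all of them.

```python
import concurrent.futures
import os

def checksum(chunk):
    # CPU-bound work over one slice of the data.
    return sum(b * 31 for b in chunk)

data = list(range(1_000_000))
cores = os.cpu_count() or 1
size = -(-len(data) // cores)  # ceiling division: one chunk per core
chunks = [data[i:i + size] for i in range(0, len(data), size)]

# The old single-threaded shape: one core does everything, in order.
serial = sum(checksum(c) for c in chunks)

# The multi-threaded shape: the same work spread across the cores.
# (In CPython a ProcessPoolExecutor is needed to beat the GIL for
# CPU-bound work; the structural change is what matters here.)
with concurrent.futures.ThreadPoolExecutor(max_workers=cores) as pool:
    parallel = sum(pool.map(checksum, chunks))

assert serial == parallel
```

The hard part was never the fan-out itself; it was untangling applications whose state assumed a single thread of execution – which is why so few ever made the move.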
Sadly, that never happened. Development tools and languages still focused on the same single-threaded, procedural approaches to application development.
Applications stopped getting faster, stopped getting better with each new tick and tock of the Intel processor cadence. We could run more of them, but how many copies of Word do we actually use at a time?
If we want to sell new desktop PCs, we need new desktop applications that take advantage of that hardware. While Intel has visions of deep virtualisation as a solution, much of what can be easily parallelised is best done using the tools and techniques built into GPUs (accessed via APIs like OpenCL and DirectCompute).
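The work that parallelises easily has a characteristic shape: one small kernel applied independently to every element, with no element depending on any other. Here’s a minimal sketch of that shape in Python – the pixel data and the brightness kernel are invented for illustration, and a real OpenCL or DirectCompute kernel would be written in C-like kernel code and launched across thousands of GPU work items:

```python
# Hypothetical pixel data: 0-255 greyscale levels.
pixels = [10, 200, 130, 255, 0, 64]

def kernel(p):
    # One "work item": brighten a single pixel by 20%, clamped to 255.
    # Integer maths keeps the result exact.
    return min(255, (p * 6) // 5)

# On a CPU this runs as a loop; a GPU runtime would launch one work
# item per element, all at once, because no element depends on another.
out = [kernel(p) for p in pixels]
```

Most desktop application logic doesn’t decompose this cleanly, which is exactly why the GPU route helps media and compute workloads far more than it helps Word.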
Until we get a new generation of desktop applications that uses all those cores efficiently, there won’t be new PCs on people’s desks or in their bags.
If desktop software hasn’t taken advantage of those new processors, the cloud certainly has. The high-density datacentres that power the cloud, with their containers of commodity servers, take advantage of virtualisation to operate in ways the home PC can’t.
Arrays of throughput servers provide scale-out capability, moving compute from the PC to the cloud. That means the home PC is being relegated to a service endpoint – a future akin to that of the rumoured “always-on” Xbox.
Windows 8 gets some of the blame here too. As I noted in my last post, the WinRT APIs at the heart of Windows 8 make it a lot easier for developers to write applications that offload functionality to the cloud.
HP’s Moonshot announcement is indicative of this trend. Instead of innovating in the home and office PC, it’s offering cloud providers access to hardware innovation at cloud speeds. HP has already shown it can develop new processor boards for the Moonshot backplane in under three months – and with the initial Atom-based low-power servers only the start, it’s clear this is where HP sees the future of computing.
And so we approach a cusp. The way we both use and buy computing is changing, dramatically. Our home and office PCs get better, but only when software gets better – and that means we’re buying new hardware less often.
So yes, in a way this collapse is Windows 8’s fault. With better performance on older hardware that lasts longer, and with tools that make it easier to work with the cloud, there’s little or no need to buy a new PC – at least not until something fails in your current PC, or until there’s a compelling new hardware feature that makes your life easier.
If the industry wants to sell more hardware, then it needs to encourage developers to produce software that takes advantage of its capabilities. Until then, well, what we have now is good enough to meet our needs.
So welcome, then, to the era of “good enough” computing. The hardware we have doesn’t need to be any better for the software and tools we use. That’s been the real story of the last decade, and one that’s conveniently left out of the narrative.
We’re at a plateau, of computational power, of software, and of developer tools. The exponential explosion of desktop computing capabilities of the last thirty years is over, and it’s unlikely to come back. And that’s the real reason why PC sales have plummeted.