I am easily a foot taller than Andy Grove. But whenever I was with him, I felt that he was the giant.
That's what the bestselling Harvard business professor Clayton Christensen wrote about the former Intel chief executive when he passed away in 2016. Christensen, who coined the term "disruptive technology", said he would most miss Grove's ability to understand how a complex organisation works, and to wield that understanding to Intel's advantage.
It allowed Grove, who started at the company the day it was incorporated on July 18, 1968, to famously re-orient the business in the 1980s. Intel shifted away from memory chips for mainframe computers towards the microprocessor – the engine that spurs into motion when you turn on your computer.
Propelled by a deal with IBM to put Intel processors into all its personal computers, the company came to provide Silicon Valley with one of its most essential technologies. Intel Inside and the accompanying jingle became one of the most memorable advertising slogans of the modern era.
Even after five decades of dominance, no other company in the world produces a better or faster microprocessor. Intel sits at the pinnacle of an industry that engineers miracles like no other. We tend to perceive innovation as something uncertain, particularly when it relies on scientists to drive it forward. Yet there has been nothing uncertain about Intel: it has released successive advances in processor engineering like clockwork.
In 1965, future co-founder Gordon Moore made a bold prediction about the exponential growth of computing power. He forecast that the number of transistors etched into a fixed area of silicon would double every year – a rate he revised in 1975 to every two years – and so, therefore, would computing power. Intel has since delivered on this improbable promise, immortalising "Moore's law".
It's difficult for anyone to fathom the effects of exponential growth. But it is why a single iPhone today possesses many times more computing power than the entire spacecraft for the NASA Apollo moon mission of 1969. Without Moore's law, there would be no Google, no Facebook, no Uber, no Airbnb. Silicon Valley would be like any other valley.
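The compounding behind that claim is easy to state and hard to intuit. A minimal sketch makes it concrete, assuming the canonical two-year doubling period and a well-known 1971 starting point (the Intel 4004, with roughly 2,300 transistors); the function name and figures here are illustrative, not Intel roadmap data.

```python
def transistors(year, base_year=1971, base_count=2300, doubling_period=2):
    """Project transistor count, assuming one doubling every `doubling_period` years."""
    doublings = (year - base_year) / doubling_period
    return base_count * 2 ** doublings

# Twenty years is ten doublings: a roughly thousand-fold increase.
for year in (1971, 1991, 2011):
    print(year, round(transistors(year)))
```

Ten doublings multiply the count by 1,024 – which is why a two-decade gap, not raw chip size, explains the distance between an Apollo-era computer and a modern phone.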
The big miss
And yet, the iPhone is also what Intel missed. Soon after the company won Apple's Mac business in 2005, Steve Jobs came asking for another chip, this time for his smartphone. Intel certainly wanted to dominate this emerging sector, but the price Jobs was offering was below Intel's forecast cost, and the company misjudged the size of the iPhone market. It passed.
Apple had no choice but to build its own chipsets by licensing technology from ARM, a British company now owned by Japan's SoftBank. If Apple and its iPhone had been the only competitors, Intel might have been able to adapt gradually. But Google soon came in with Android, a free operating system that Samsung, Huawei and HTC all adopted. Qualcomm, Nvidia and Texas Instruments, all ARM licensees, became the phone makers' go-to suppliers for energy-efficient, low-cost computing devices.
These American rivals are not trying to beat Intel at its own game. Qualcomm specialises in chips for mobile phones and Nvidia in graphics for video games, and they all outsource production to third parties in Asia. But an Intel microprocessor sells for around US$100, while ARM-based chips sell for around US$10, and often less than a dollar. That is why ARM-based designs are now found in more than 95% of the world's smartphones.
In other words, Intel failed to compete in smartphones against rivals with far fewer resources. It's a great irony when you reflect that Grove once invited Christensen to Intel's headquarters in Santa Clara, California, to explain his theory of disruption. Grove later credited the meeting as the main driver of Intel's decision to launch the Celeron chip in 1998 – a cheap product aimed at low-end PCs which, within a year, captured 35% of the market.
The new goldrush
Now the big question is whether Intel is repeating its iPhone mistake – this time in driverless cars. In March 2017 it purchased Mobileye, an Israeli company that makes digital vision technology, for US$15.3 billion. It was a big bet on a sector with huge potential: as autonomous driving takes off, vehicles are becoming computers on wheels. They will require more and more microchips, and Intel hopes to dominate.
Except for one glitch. Everything Intel has done in the last 50 years is geared towards general purpose, high-end chipsets. Its integrated model – where the company designs and manufactures its processors – means it absorbs an enormous amount of fixed cost, in research and design as well as manufacturing.
The only way to offset these burdens is to sell a high volume of devices at high margins. The result is a company obsessed with technological progress, but shackled to a rigid business model that dictates what it can and cannot do. There's a monster inside Intel with a ferocious appetite.
But what if autonomous driving doesn't actually require the computing power Intel is counting on? This is the competing vision of Huawei. When I recently visited Shenzhen, executives from the Chinese telecom giant explained to me that much of the city's infrastructure will be digitalised and that Huawei will saturate it with a 5G network. This will drastically reduce any speed and latency problems for computers.
This means the computing inside cars can be mostly offloaded to the city's infrastructure. It is a radical vision, but clearly a viable alternative. The implication is that a BMW or Toyota doesn't need that many high-end chipsets after all. It's smartphones all over again.
But in each case – smartphones then, perhaps driverless cars now – Intel's business model and the demands of its existing shareholders formed an intractable nexus that even the most courageous executives found impossible to navigate. Grove once said that "only the paranoid survive". Maybe he was right.