Although Apple’s announcement last week that it was transitioning the Mac away from Intel chips and onto its own “Apple Silicon” was expected, it’s still worthwhile to understand how Apple got to a point where it was willing, and seemingly quite eager, to make such a risky transition. Make no mistake about it, this is risky: switching underlying architectures, whether hardware or software, is a fraught process, one that few companies have dared to attempt.
“Today is the day we are announcing that the Mac is transitioning to our own Apple Silicon,” Apple CEO Tim Cook said, announcing the change.
Cook with Apple Silicon (Source: Apple)
“When we look ahead, we envision some amazing new products, and transitioning to our own custom silicon is what will enable us to bring them life,” Cook said. “At Apple, integrating hardware and software is fundamental to everything we do. That’s what makes our products so great. And silicon is at the heart of our hardware, so having a world class silicon design team is a game changer.”
The bulk of the computing world has chosen a much simpler path of more evolutionary upgrades with an eye on compatibility. Other than Apple, the rest of the desktop and laptop world pretty much still runs on legacy improvements to the Intel x86 architecture and the Microsoft operating system that the original IBM PC ran in 1981. The path to Windows involved building on top of, and then incorporating, DOS, rather than replacing it; the most successful versions of Windows have been the ones that were the most familiar and the most stable. (The counterexample is the short-lived Windows 8.) When the time came to move to 64-bit processors, it was an extension of the x86 architecture (pioneered by AMD) that succeeded, not a wholesale change. On most PCs in the market, you can still open a DOS box and run the version of VisiCalc designed for the IBM PC in 1981.
You give that up at tremendous risk. As Cook pointed out, Apple has taken on this risk three times in the past, in the transitions to PowerPC, to OS X, and to Intel. So it’s instructive to look at each of these transitions and see what went right, and what went wrong.
Moving to PowerPC
The first transition was from the Motorola CPUs that powered the initial Macintoshes to PowerPC in the early 1990s. Then, as now, Intel’s chips dominated the PC landscape, so much so that Motorola was having trouble making a go of it with its own 68000-based chips. Meanwhile, IBM was upset at the idea that there were so many “Wintel” clones running its software. So IBM, which then as now had its Power line of processors, and Motorola joined with Apple to create the Apple–IBM–Motorola alliance, known as AIM, in 1991. This led to the creation of the PowerPC, which first shipped in Macs in 1994. The idea was that PowerPC would outperform Intel. That was always arguable, although you could find benchmarks on both sides for a while.
But it became harder and harder for AIM to keep up with Intel’s manufacturing process. Since IBM never had much success with the PowerPC in the mainstream market (although to this day it still makes Power chips for higher-end servers), the costs of designing and manufacturing the chips had to be spread across much smaller volumes than Intel’s. The result was a set of machines that were more expensive but less powerful than equivalent Intel machines, to the point that it nearly killed Apple.
When Steve Jobs announced that Apple was moving to Intel in 2005, everyone acknowledged it was time. Today, the PowerPC legacy remains in some IBM processors and in some embedded processors from Freescale (a chip company that spun out from Motorola).
Creating OS X
The second transition was from the classic Mac OS to OS X (which was more recently renamed macOS).
The original Mac OS essentially lasted from the introduction of the Macintosh in 1984 up through Mac OS 9 in 1999. But it wasn’t quite that simple. After System 5, it was clear that Apple needed something more modern, much the way the original DOS operating system was eventually supplanted by the Windows NT core. Apple started work on an OS named Pink, which was absorbed, in 1992, into another Apple/IBM joint venture called Taligent, later joined by HP.
That was a bigger failure, and it fell apart because no one could really agree on what the operating system should look like. (Eventually, it became the basis for IBM’s Workplace OS, which never even got a mainstream introduction.) When that failed, Apple updated its existing Mac OS (by then System 7) and started work on another ambitious OS effort, known as Copland, which also never got to market.
Indeed, in the mid-1990s, Apple was looking at a variety of options, including the well-regarded, multimedia-focused BeOS, before deciding to buy NeXT in 1997 in the deal that brought Steve Jobs back to Apple. NeXT had created a machine and, more important, an operating system called NeXTSTEP, based on the Mach kernel and an implementation of Unix. It eventually included an object-oriented framework based on Objective-C.
Apple’s original plan was to develop a new OS to run alongside Mac OS. But given the false starts Apple had gone through with its plans for a new OS, many developers were skeptical. So after Jobs became CEO again, Apple decided to combine elements of Mac OS and the NeXT OS, in part by offering an API called Carbon that made it relatively easy to get Mac OS applications running on the new OS. The result was OS X in 2001, a Unix-based OS that could still run old Mac OS applications (at least as long as Apple was still using the PowerPC).
OS X has been upgraded since then and renamed macOS. It stayed on “version 10” for a long time, and the basic design elements have remained very stable, even as the OS has added new features. With some significant design changes, macOS “Big Sur,” also announced last week, is the first version to be labeled “version 11.”
All the fits and starts on the way to OS X are indicative of the risks such a major change entails. Apple’s failures with Pink, Taligent, and Copland cost it a lot of developer support. But in the end, it was worth it: it gave the Macintosh a modern OS with the differentiation Apple would use to continue to charge premium prices.
Apple’s Intel Transition
Until last week, the most recent change was the move from PowerPC to Intel, which was announced in June 2005 and completed by the end of 2006. (We covered it here.)
At the time, the PowerPC alliance was in trouble. While IBM was still developing very competitive desktop chips such as the PowerPC G5, it wasn’t delivering competitive chips for notebooks, which were becoming a bigger part of the market.
At the time, Jobs said, “Apple just didn’t know how to build the amazing computers we want to deliver in the future with the PowerPC,” and he specifically talked about both raw performance and better power efficiency, in other words, performance per watt—which was necessary to create thinner, smaller laptops. Apple said it had actually been working on the transition ever since it finished the creation of OS X.
One of the technologies Apple included with OS X for years after the transition was Rosetta, which translated PowerPC code so existing applications could run on Intel processors. Apple kept it as part of the OS until the “Lion” version of OS X dropped it in 2011.
The first Intel-based Macintoshes were released in January 2006, and by the end of the year, all of the models in the family had been updated with Intel-based versions. Apple continued to support PowerPC in upgrades to OS X until it released the “Snow Leopard” version in 2009. Intel processors have generally served Apple well for 15 years.
The Apple Silicon Transition
So why change now?
In some ways, I think you can trace this back to Intel’s decision not to provide the chip for the iPhone and Apple’s subsequent decision to make its own chips. And then there’s the simple fact that Intel still makes most of its chips on a 14nm process, while chip foundry TSMC, which Apple uses, is now making chips at 7nm and moving to 5nm chips later this year. (To be fair, Intel does have some 10nm production, which is roughly equivalent to TSMC’s 7nm, but it’s pretty amazing that when Intel introduced its first 14nm chips in 2014, it was almost two years ahead of TSMC.)
Johny Srouji (Source: Apple)
At the announcement, Apple SVP for Hardware Technologies Johny Srouji said that for ten years, Apple has been building a “scalable architecture that is custom designed for Apple products,” with a focus on performance per watt. Over the past 10 years, he said, CPU performance has improved by over 100 times (going from the A4 in 2010 to the A13), while on the iPad, graphics performance has improved 1000 times. Including iPhones, iPads, and the Apple Watch, he said, Apple has shipped over two billion SoCs (system-on-chip, effectively modern processors).
That gives Apple the scale to produce unique processors cost-effectively—something that wasn’t true in the PowerPC era. Also, Apple is producing these chips at TSMC, on cutting-edge processes which are arguably a couple of years ahead of where Intel is. How times have changed.
SoC Features (Source: Apple)
Srouji said Apple is developing a family of SoCs specifically for the Mac: “Our plan is to give the Mac a much higher level of performance while at the same time consuming less power.” That’s reason enough, he said, but he then discussed how Apple’s scalable architecture also includes other things, such as advanced power management, a secure enclave (for privacy and security), a high-performance GPU, a neural engine for machine learning, and an image-processing engine. (I was interested that at no point in the presentation did Apple mention ARM, even though its processors take advantage of an ARM architectural license.) But the key advantage Apple has, he said, is “the tight integration of our silicon with our software.”
Of course, we won’t really know how well any of this works until final software is shipping and we can really test it. (In the meantime, PCMag ran some benchmarks comparing Intel to Apple’s current processors and got some pretty decent results.)
Developer Tools for Transition (Source: Apple)
On the software front, Apple’s Senior Vice President of Software Engineering Craig Federighi said the technologies built into the new Big Sur version of macOS “will make the transition to Apple silicon smooth and seamless for both consumers and developers.” He said that most of the developers who use Apple’s Xcode libraries will be able to get their code up and running “in a matter of days.” They can then distribute this code with Universal 2, which lets them create a single application binary that supports both Intel and the new processors.
Federighi said all of Apple’s applications, including the Pro applications, will be native for Apple silicon, and that Microsoft and Adobe are both well along the way to porting their applications. Demos included Microsoft Word, Excel, and PowerPoint, Adobe Lightroom and Photoshop, and Apple’s Final Cut Pro, all running on a developer system based on the A12Z chip used in the current iPad Pro.
For developers who don’t produce native applications as soon as Apple releases the new systems, Apple has Rosetta 2, which is designed to translate existing apps at install time, or even dynamically for software that uses just-in-time compilers, such as Java. Apple showed Maya and a version of Tomb Raider both running under the new Rosetta.
The new Macs will also support virtualization, letting you run other operating systems on top of macOS Big Sur (a feature used mainly by software developers), but not Boot Camp, an Apple program that lets you boot a Macintosh into Windows. (It’s unclear how you would get Windows anyway, as Microsoft only licenses Windows on ARM to system makers, not individuals.) Still, third-party developers such as Parallels are working on alternatives.
Because they run a variant of the silicon used in the iPhone and iPad, the new Macs should be able to run all of those applications. (I wonder if this will finally get Apple to support touch screens on MacBooks, an area where they have long lagged their Windows counterparts.)
macOS Developer Kit (Source: Apple)
Federighi announced that Apple has started a quick start program for developers, including a developer transition kit machine, which uses a Mac mini enclosure with an Apple A12Z SoC, 16GB of memory, a 512GB SSD, and the macOS Big Sur developer beta and Xcode tools, all available now. Cook said the first consumer systems using Apple silicon should be out by the end of the year, and said a transition to all Apple silicon products should take about two years, though Apple will still come out with new Intel machines in the meantime and support macOS on Intel “for years to come.”
“Our vision for the Mac has always been about embracing breakthrough innovation and having the courage to make bold changes,” Cook said. “Every time we’ve done this the Mac has come out stronger and more capable and I have never been more confident about the future of the Mac than I am today.”
Of course, we will not really know the results for several years. It seems highly likely that Apple will successfully move its developers to the new platform, but whether the new MacBooks can be as fast or faster than Intel or AMD-based laptops or as power efficient over a long period of time is an open question. Given that Apple is nowhere near as dependent on the Mac as it was when it made the previous moves, it’s still a risk, but one the company can afford to take.