Apple’s move to ARM expected in ‘chip wars’

The reasons why Apple might start putting ARM chips in its laptops are legion. But is the rest of the world ready?
22 June 2020

Will Macs be the first mainstream laptops to use ARM? Source: Shutterstock

The Apple rumor mill is a full-time operation, supporting whole websites and news channels even in mainstream media. It subsists on the thin trickle of rumors and leaks emanating from the Cupertino company, which is known in industry circles for its scrupulous attention to secrecy – an aspect of the global giant that adds to the air of mystery around its next product launches and OS updates.

With Apple's online-only WWDC (Worldwide Developers Conference) due to start on Tuesday, the Apple-centric sites and "Apple-watchers" are in overdrive, making it more difficult than usual to separate likely announcements from hyperbole.

One point of consensus this year, though, is that Apple is expected to announce a transition of its laptop range (currently the MacBook Air and MacBook Pro) to ARM chipsets. That means it will, if not entirely phase out Intel chips in those products, at least aim to reduce its reliance on the US semiconductor giant.

Apple has already started to stray from the Intel fold: the 16″ MacBook Pro sports fast AMD graphics processors. That choice has contributed to the chip industry's current approval of AMD, a company whose silicon has met with a variable reception in the past. AMD, whose Ryzen 4000 range of low-power laptop chips is causing much excitement for its speed and frugal power consumption, is seen at present to be beating Intel in the CISC (complex instruction set computer) wars.

Apple already designs its own ARM chips, of course: they power the iPhone and iPad, and appear in the guise of the T1 and T2 chips in some Apple laptops, driving the Touch Bar on high-end MacBook Pros and serving as the security chip found across the laptop range.

The advantages for Apple in shifting to the RISC (reduced instruction set computer) ARM architecture are significant. Primarily, it would reduce its reliance on Intel, whose supply chain woes have in the past caused Apple several headaches, leaving the company unable to ship its powerful range of desktop and laptop computers according to demand. By controlling the design of its own ARM chips, Apple removes a third-party dependency from its list of pain points and, of course, significantly lowers its costs: Intel's per-chip mark-up becomes history, and ARM chips are (to use a British phrase) "as cheap as chips".

Apart from the lower costs, ARM chips were designed to have very low power requirements; they therefore generate much less heat and need less cooling. That makes them perfect for portable devices: the phone on your desk, or the one you're reading this article on, runs on ARM chips, as do most IoT devices, from domestic TVs to the controllers found in air conditioners. So ubiquitous and cheap is the ARM platform, in fact, that Apple even places ARM-based controllers in its charging cables for phones, laptops, and tablets to help prevent unnecessary power draw, the firm says.

On the surface, Apple's shift to the ARM architecture is a win-win: lower costs, better supply chain control, fewer cooling components such as fans in its products, and so on. Furthermore, the much-vaunted confluence of iOS, which runs on iPhones and iPads, with macOS, which runs on Apple's computer range, suddenly takes a giant leap forward. Developers might soon be able to maintain a single codebase, deploy to the App Store for portables, and have the same app run with minimal tweaking on Apple computers, too.

However, the fly in the ointment is that ARM chipsets use a completely different low-level instruction set to run software (a reduced instruction set rather than a complex one). That means every part of macOS and every application currently running on Mac hardware will have to be recompiled (at best) or, wherever code makes assumptions about the underlying architecture, partially rewritten. That's no easy task, even for the operating system's own developers. Every third-party creator of Apple computer software will have to go back to the drawing board and reproduce their applications for the new architecture.

Apple has pulled off this move before, shifting from PowerPC chips to its current x86 architecture starting in 2005. For many years thereafter, developers would ship applications with two binaries in their packages: one for the then-outmoded PowerPC computers, and one for the new, shiny Intel machines.
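That two-binaries-in-one-package trick survives today as macOS universal ("fat") binaries, and the same mechanism would presumably serve an Intel-to-ARM transition. A sketch using Apple's real clang and lipo tools (macOS only; hello.c is a hypothetical source file):

```shell
# Inspect which architecture slices an existing macOS binary contains
lipo -info /usr/bin/true

# Build one slice per architecture from the same source
clang -arch x86_64 -o hello_x86 hello.c
clang -arch arm64  -o hello_arm64 hello.c

# Glue the slices into a single universal binary; the loader picks
# the right slice for whichever machine it runs on
lipo -create -output hello_universal hello_x86 hello_arm64
lipo -info hello_universal
```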

That transition didn't drive many software developers away from the platform at the time – or at least, none that mattered particularly. Apple's considerable heft in the walled garden of its own construction is such that if Adobe wishes to keep making Mac software, it has little choice but to bow meekly and get on with the rewriting that's landed on its plate. That process will indubitably have begun already if Apple is committing to ARM: its non-disclosure agreements and many in-house lawyers mean that no software house currently working (very) late will be bemoaning its lot. At least, not publicly.

If Apple does make the switch to ARM, expect other operating system and hardware makers to follow suit, at least in some form. The transition will be a slow one, however, even if Apple leaps ahead to realize the advantages ARM offers. ARM Holdings (owner and licensor of the ARM architecture) has serious designs on the data center market, so expect server OSes and software to be next to transition to the new low-power, low-heat future once ARM hardware makes a bigger splash server-side.

Watch this space and join the many millions of global Apple fanbois on Tuesday to watch the WWDC keynote, being streamed to a screen near you.