Arm-based PCs Q&A

Something very new, interesting, and maybe just a little bit scary is starting to happen ever-so-slowly to traditional laptops and desktops – the transition to Arm! This Q&A will tell you all about it!

What is Arm?

This is going to get a little abstract for a bit, so please bear with me, okay? Take it slow and read this section a few times!

Arm is a type of instruction set – or language – for computer processors. You may have heard of binary numbers – 1s and 0s – being called the language that computers understand, but it’s probably more correct to say that those 1s and 0s are an alphabet. Think of how the same alphabet is used to write English, French, Spanish, and Dutch even though these languages are not mutually understandable, because those same letters are used to form very different words. In the same way, the 1s and 0s of the binary alphabet are used for many different instruction set languages. It’s the very different words of these very different languages that describe to a processor what it is supposed to do next, like draw something on the screen or make an alien start firing its laser gun at you!

Unlike people, who can be bilingual and learn different languages as they grow, processors are manufactured from the start of their lives to only understand one specific instruction set and that’s that! It’s fundamentally how they are structured and so that’s the only thing they are built to use.

When programmers make apps, they initially write them in something called source code, which is easy to look at and edit. When they are ready, they can run something called a compiler, which converts that source code into what’s called a binary or build. That build is written in a specific instruction set, and it’s the thing that actually gets installed and run on your phone or laptop. When programmers feel it’s needed, they can make multiple builds for different instruction sets so that different kinds of processors are able to run their program. Other times, they might only make binaries for one particular instruction set. It’s a lot of work to adjust source code so that it compiles correctly for different instruction sets, so programmers only support as many instruction sets as they have to.
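If you’re curious, many programming languages let a running program ask which instruction set family its machine reports. Here’s a minimal Python sketch – the exact string it prints varies by machine and operating system, so treat the values in the comments as typical examples rather than a guarantee:

```python
import platform

# Ask the operating system which processor family this machine reports.
# Typical answers: "x86_64" or "AMD64" on x86 machines,
# "arm64" or "aarch64" on Arm machines.
print(platform.machine())
```

Running this on an Apple Silicon Mac typically prints arm64, while the same script on a traditional Windows PC typically prints AMD64 – same source code, different instruction set underneath.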

The processors found in nearly all tablets and smartphones have been made for the Arm instruction set for ages. It’s a good choice for devices like this because it is really power efficient, generating very little heat. This is the main reason why you’ve never seen a super-thick smartphone with huge vents on the back and a howling fan trying to keep it cool – it’s thanks to the way the processors inside phones use the Arm instruction set. Not having to power a spinning fan adds even more to how long your battery will last.

Until recently though, laptops and desktops haven’t used Arm. Instead, their processors tend to use the x86 instruction set. The x86 instruction set language is made up of lots of bigger, more complicated words that let processors do things in fewer steps, and that makes it easier to make the processor run really fast. That’s important because people tend to demand a lot more performance from a desktop or laptop than they do from their smartphones. But because the x86 instruction set is more complicated, x86-based processors end up using a lot more power – and generating a lot more heat – trying to accomplish the same thing compared to their Arm-based counterparts. That’s why laptops tend to have such terrible battery life compared to smartphones. It’s also why they tend to be a lot noisier, especially when you have them do a lot.

With these differences, you can kinda see why it makes sense to make smartphone and tablet processors Arm-based, and to make laptop and desktop processors x86-based. And that’s why this is the way it’s been for decades. But that’s exactly what’s about to change. The age of the Arm-based PC has begun!

So people are making Arm-based desktops and laptops now? Why?

As I said, laptops and desktops need to be fast enough to do what people need them to do. That’s why it’s been worth the sacrifice to power efficiency to give them x86 processors. For the longest time, we needed to do that in order to make them fast enough.

But computers have been getting faster and faster in general over time. At this point, even the cheapest computer is capable of doing what most people need it to do. Part of why you see more and more laptops these days compared to desktops is because most people just don’t need that much relative power anymore. It’s also why people are holding onto their old computers for longer, because they are remaining useful for longer. The biggest competition a computer manufacturer has to fight with is the computer you already have. Why would you give them money for a faster computer when your existing computer is already more than fast enough? But what if they could offer you a computer that’s just as fast, only it runs much quieter, has a battery that lasts much longer, and doesn’t need to be so bulky and heavy because it doesn’t need a huge fan to get rid of excess heat? So the market pressure is changing over time to be less about making PCs faster than before, and more about making them the same speed as before, only more mobile.

Meanwhile, there have been major strides in research and development that are letting people make Arm-based processors that are much, much faster than they used to be while still preserving most of their energy efficiency and heat advantage. Apple in particular has shown you can make Arm-based processors that are actually faster than most x86-based processors people use today, all while retaining the advantages of Arm at the same time!

This means more and more, we are seeing a market incentive to try making PCs use Arm-based processors instead of x86. Companies follow the money! 💰

So my very next laptop will be thinner, lighter, quieter, and last longer without needing a charge. Sounds good. What’s there to worry about then?

Remember earlier how I said that processors can only understand programs made for their instruction set and nothing else? Think about it: an entire ecosystem of programs for desktops and laptops going back many decades has all been made for x86, because that’s all there was on such systems anyway. Your shiny new Arm laptop wouldn’t be able to run any of that, and that’s a problem! So much for being able to play your favorite computer game, run your word processor, or even start up Windows itself! Yikes!

And if you can’t run anything on an Arm laptop, why would you buy it? And if no one wants to buy one, then no one would be using one. And so then why would programmers ever bother to compile Arm builds of their desktop/laptop-focused programs if there’s no one around needing them? It’s a chicken-or-egg problem that needs a lot of work and creative solutions to get around. It’s also something that is going to take a lot of time to overcome – think decades!

So what’s being done to make Arm PCs feasible?

One thing that is happening now is that people who make operating systems, like Windows, Linux, or macOS, are starting to compile Arm builds of those operating systems, so that they can be run on Arm-based PCs. That’s taken several years because a lot of changes to the source code needed to be made and a lot of testing needed to be done!

These Arm versions of operating systems also need programs to run in them, and the programs mostly aren’t here yet. So what everybody is doing is making what’s called binary translation software. The way it works is you try to open some x86 program on your Arm PC that normally couldn’t be run. When you do that, the binary translation software works behind the scenes to translate all the x86 instructions in the program into Arm instructions. Your processor can then follow those translated instructions, and the program will work.
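To make the idea a little more concrete, here’s a toy Python sketch – not real machine code, and the two-instruction “language” is completely invented for illustration – showing the core loop of a translator: each foreign instruction has to be looked up and converted into a native operation before it can actually run.

```python
# Toy illustration: "translate" a made-up foreign instruction set
# into native Python operations, one instruction at a time, the way
# a binary translator converts x86 instructions into Arm ones.
def translate_and_run(program, value=0):
    # The translation table: each foreign instruction maps
    # to an equivalent native operation.
    table = {
        "INC": lambda x: x + 1,   # foreign "INC" becomes a native add
        "DBL": lambda x: x * 2,   # foreign "DBL" becomes a native multiply
    }
    for instruction in program:
        # Every single instruction needs this lookup-and-convert
        # step before it runs -- that extra per-instruction work
        # is the overhead of translation.
        value = table[instruction](value)
    return value

print(translate_and_run(["INC", "DBL", "DBL"]))  # 0 -> 1 -> 2 -> 4, prints 4
```

Real binary translators are enormously more sophisticated than this (and often cache their translations so they don’t redo the work every time), but the basic shape – convert each foreign instruction, then run the native equivalent – is the same.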

Naturally though, there are gotchas to doing this. In the same way something can get lost in translation from Spanish to English, translating x86 programs in this way can result in things not working as intended, and programs can malfunction and crash. It’s also a lot of effort to convert all these instructions one-by-one, and that conversion is being done by the processor like everything else. That means extra work for the processor to accomplish the same thing, so everything will run more slowly and consume more power. It would be much better if there were just Arm builds of programs available instead, but these binary translators are a good stopgap that will allow Arm desktops and laptops to be usable in the meantime. Over time, fewer and fewer programs will need binary translation to work as the incentive grows for programmers to make Arm builds for any new programs they make.

Now all that does make things sound a lot uglier than they actually are at this point! Binary translation in the popular operating systems has gotten amazingly good by now already. Very few programs break anymore, and unless you are trying to run a high-end game, you probably aren’t going to notice much of a performance hit. And things are going to keep getting better! Already, many popular programs are becoming available in Arm versions. So there’s reason to be optimistic!

Is this in any way related to the transition to “64-bit computing” I heard about many years ago?

That was an instruction set transition too, but a much easier one!

Back in the mid-2000s, desktops and laptops had a problem. They were all using the x86 instruction set, which at the time was only a 32-bit instruction set. Among other things, this meant it had only 32 binary digits that could be used for describing different spots in memory by number. Since bigger numbers don’t fit, the software wasn’t able to work with any spot in memory beyond the first 4GB of a computer’s RAM, effectively limiting computers to only 4GB of memory. That was becoming a problem as people’s memory needs continued to grow.

So a processor company called AMD developed a variant of x86 called x86_64, or x64 for short, that turned it into a 64-bit instruction set instead, which solved this problem. The old 32-bit x86 was retroactively named x86_32. Don’t you just love the way geeky people name things?

The reason this transition was less painful than the Arm one happening now is that x86_64 was totally backwards compatible with x86_32. No special binary translation software was needed, and x86_64 processors could still understand the shorter x86_32 instructions just fine, sort of like how adults can still read English instructions written for a small child. You could even have a computer with an x86_64 processor use an x86_32 build of Windows, and that would work without any problems. To actually be able to use more RAM though, you still needed to run an x86_64 build of Windows, which Microsoft had to make. Some programs started to come in x86_64 builds, while others didn’t, and Windows needed to supply two entirely separate copies of support libraries so that either type of program could be run.

The messiness of this still exists in Windows today. Even now, many Windows programs still only come in so-called “32-bit” (x86_32) versions, often because they just don’t need more than 4GB of memory to run well. Other programs only come in “64-bit” (x86_64) versions. Still others give you a choice by producing separate “64-bit” and “32-bit” builds, and letting you choose which one you want to download. This way, if you have an older computer that still only has an x86_32 processor, you can still run their program. Since such computers are nearly 20 years old at this point, this is becoming less and less of a thing. Windows 11 is the first version of Windows to no longer supply x86_32 builds of the operating system anymore, because any computer that would need them would be too old to run Windows 11 anyway. It still has the support libraries needed to run x86_32 programs though.

At least we won’t have to worry about this kind of thing again. The version of the Arm instruction set that PCs are transitioning to is already 64-bit, which allows software to handle up to around 17 billion gigabytes of RAM before it runs out of addresses. So I think we’ll be fine there for a while, and no one is going to be talking about the wonders of “128-bit computing” any time soon! 🙃
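The arithmetic behind both of those limits is easy to check for yourself. With 32 binary digits you can count to 2 to the power of 32, and with 64 digits, 2 to the power of 64 – a quick Python sketch:

```python
# A 32-bit address can name 2**32 different spots in memory.
print(2**32)              # 4294967296 bytes...
print(2**32 // 1024**3)   # ...which is exactly 4 gigabytes

# A 64-bit address can name 2**64 different spots.
print(2**64 // 1024**3)   # 17179869184 gigabytes -- about 17 billion
```

That’s where the “only 4GB of memory” ceiling and the “around 17 billion gigabytes” figure both come from.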

How is the Arm transition being handled by Microsoft Windows?

Windows 8 back in 2012 was the first version of Windows to come in an Arm version, and the first Arm-based Windows PCs were made at around that time. But the Arm version of Windows 8 wasn’t really the same product (they gave it the different name of “Windows RT”) and was very limited in functionality. It had no binary translator at all, so no existing Windows programs worked. The market responded poorly to the idea, and Microsoft ultimately abandoned the Arm version completely. To this day, they are still dealing with the reputational damage this caused to the idea of an Arm-based Windows PC.

With Windows 10 though, they tried again, and things went much better this time! They’d learned their lesson and now they knew to include a binary translator, even if it was limited and could only work with x86_32 programs. These early efforts were expanded in Windows 11, and by now, the Arm version of Windows is fully featured and equally supported alongside the traditional x86 version.

An Arm-based PC running Windows 11 should be able to run nearly any x86 program well, so long as that program doesn’t use virtual device drivers (like creating a virtual microphone or CD drive) and so long as that program runs separately in its own window, without trying to integrate with the rest of the system. So something like a computer game should work okay, while something that adds new functionality to the Start menu will almost certainly not work. There are always going to be rare exceptions though and sometimes a program just won’t run for no apparent reason. And of course, binary translation is slow, though it’s not noticeable at all for simple programs.

A surprising number of Windows programs, including Google Chrome, Microsoft Office, Notepad++, 7-zip, etc., are already available in Arm builds and no longer need binary translation. The most significant holdouts have been high-end games. This is because nearly all Arm-based computers in existence have so far been hyper-portable laptop computers that couldn’t run these high-end games anyway. The situation will change I think as Windows Arm processors catch up to Apple and become faster, letting higher and higher-end PCs go Arm.

Binary translation isn’t available for device drivers, so any hardware peripherals complicated and unusual enough to need their own support software installed are only going to work if they explicitly support Arm. Fortunately that’s already become pretty common. Logitech’s Logi Options+ software for example became Arm-compatible in 2023.

If you were going to buy a Windows laptop today, I’d still recommend it be an x86 laptop, mostly because the companies making Arm processors for Windows are very behind Apple and so Windows Arm processors are still too slow. But I think the situation will be very different only a few years from now, and I think if you were to buy an x86 laptop today, it would probably be your last one! As for desktops, they don’t really benefit from longer battery life so I think they will be the last to switch over. I wouldn’t be surprised if Arm-based Windows desktops still aren’t widely available at all even a decade from now or longer!

How is the Arm transition being handled by macOS?

Apple’s transition to Arm is directly tied to their transition away from using Intel-brand processors in their Macs (which were x86-based) to using their own in-house technology (which is Arm-based). And you’d be shocked how much further along they are! 😵‍💫

By 2020, Apple had grown really frustrated with Intel’s inconsistent release schedule, which was embarrassing them and forcing them to delay new Mac models. To break free of their dependence on Intel, Apple started making their own desktop and laptop processors, which they called Apple Silicon. The first Apple Silicon Macs were sold in 2020, and by 2023, Apple had successfully transitioned their entire line away from Intel, so that no new Intel Macs are being made anymore. That means today’s Macs are all 100% based on Arm, from their most portable laptops to their most powerful desktops. While Windows is only just getting started, Apple has already left x86 behind completely!

Apple was able to do this so quickly because of the huge advantage they have in controlling both the hardware and the software. No collaboration was needed between a pile of different companies for there to suddenly be an Arm-based Mac, running an Arm version of macOS. It could just happen because they could do it all by themselves. For better or worse, Apple also has a long-running reputation for being unafraid to drop support for older technologies quickly. Programmers know that when something new comes out, they need to update their apps to support it right away or they risk becoming incompatible with a later release of macOS. For this reason, the majority of popular Mac apps offered Arm builds less than a year after the first Arm-based Macs came out, and by now, there are very, very few holdouts left.

For those few holdouts, you can still run them using binary translation software (Apple’s version of the concept is called “Rosetta 2”). If Apple sticks to its long-running policies on legacy support, we can expect Rosetta 2 support to be intentionally removed in a future release of macOS. At that point, any x86 Mac programs won’t run at all anymore, but that’s likely at least a few years away still.

As for the older, x86-based Intel Macs, Apple might not be making any new ones, but they still support the existing ones for now to give people time to get full use out of their purchases before they are ready for new machines. The current release of macOS, version 15 “Sequoia”, continues to be distributed as both an x86 build (for Intel Macs) and an Arm build (for Apple Silicon Macs) so that owners of both kinds of Macs can continue to enjoy the latest features! This is not a policy that will last forever, and it’s quite likely that macOS 16 – to be released in September/October of 2025 – will only come with an Arm build. But we’ll know more when the details of macOS 16 are announced to the public in early June of 2025! Whatever Apple decides to do, support for Rosetta 2 is likely going to greatly outlast support for Intel Macs themselves.

If you want to find out if a particular Mac program offers an Arm build, the huge database at the web site Does It Arm is super helpful! 🦾

How is the Arm transition being handled by Linux?

Linux – and the other software components that make up Linux desktop operating systems – have always been designed to be instruction set agnostic, and have always been available for a stunning number of super obscure instruction sets nobody’s ever heard of, including Arm. The first Linux distribution to offer an Arm build was Debian 2.2 Potato, released on 15 August, 2000!!! 🤯 Arm builds of Linux became popular with the first Raspberry Pi devices – which are ultra-low-cost, Arm-based hobbyist computers that primarily run Linux.

Most Linux programs are also “open-source”, which means the source code used to write them is publicly available. So if no one’s made an Arm build of your favorite Linux program, you can usually just compile an Arm build for it yourself!

It’s not all flowers and sunshine though. Almost no widely used closed-source Linux programs offer Arm builds, including Steam, so that’s a no-go. While there are binary translation solutions for Linux, none of them are pre-configured in any Linux distros, and they can be exceptionally slow and unreliable and very complicated to set up! Several of the most popular Linux distributions, such as Linux Mint and KDE Neon, still don’t offer an Arm version for download. All that being said, you still have plenty of good options if you want to run Linux on an Arm PC, and Linux is prepared for such a future.

How does the transition to Arm affect the use of Virtual Machines?

More information on Virtual Machines is available in my separate Q&A on virtual machines, but for now, the TL;DR version is that Virtual Machines (VMs) are software simulations of computers that run inside your real computer. You can do nearly anything with them that you can do with a real computer, with the virtual machine’s virtual monitor typically appearing in a window on your real computer’s desktop. That might sound like a weird and abstract idea, but there are a great many uses for doing this, including things casual users can benefit from. You can create a virtual machine that runs a really old copy of Windows, for example, so that you can still run your old programs. You can also create throwaway virtual machines where you can experiment and do risky things you’d be too afraid to do to your real computer, in order to grow more comfortable with computers in a safe way! I highly recommend you check out the separate Q&A and give them a try!

For a reason not known to this purple demon, binary translation features in operating systems don’t work on something like virtual machine software. So if you have an Arm-based computer, you will only be able to install and run a virtual machine program if it has an Arm build available.

The other problem is that virtual machine software doesn’t normally simulate a processor that is different from your real one on the machines it creates for you. So if your real computer has an Arm-based processor, your virtual machines are all going to have Arm-based processors too. That means they’ll only be able to run Arm-compatible operating systems like Windows 11, so you won’t be able to create something like a virtual machine that runs Windows XP or MS-DOS.

That’s not to say virtual machines are in any way more limited on an Arm computer – at least in theory. After all, the reverse would also be true – if you had an x86-based computer, you wouldn’t be able to create Arm-based virtual machines. In practice though, nobody really wants to create an Arm virtual machine on an x86 computer, but people have lots of reasons to want to create an x86 virtual machine on an Arm-based computer. So this can end up being a big disappointment.

To try and meet this demand, some virtual machine software also includes “emulation” functionality that lets you create and run an x86-based virtual machine on your Arm computer, but this is asking a lot of reality and it’s hard to do this well. To date, I don’t know of any virtual machine solution that can do this without serious compromises, like reduced performance, functionality, and stability, in the rare cases where software is able to do it at all. I do feel optimistic that the situation will improve over time though! Humans can be pretty clever about solving difficult problems to make something like this work!

When all is said and done, will Arm-based PCs really improve my experience all that much?

I can speak from experience here. It’s not going to change your life, but it will definitely make things better!

My firsthand experience with Arm-based PCs began in 2012. I was working with both a Microsoft Surface Pro (which was x86-based) and a Microsoft Surface RT (which was Arm-based). At the time, Arm processors were just too slow, making the Surface RT frustrating to use. And this was before Arm builds of Windows came with binary translation or any software at all, so you had to make do with whatever came with Windows and the built-in copy of Microsoft Office. All that said though, I really felt the huge difference when it came to heat, weight, noise, and battery life, and I could really see the potential of where this was going.

In November of 2021, I switched my main computer from a 2018 Intel i9 MacBook Pro (x86-based) to an M1 Max MacBook Pro (Arm-based). The battery life was so much better that I was able to completely change my routines for how I used my computer. After what a miserable, howling hotbox the 2018 MacBook Pro was, it also felt incredibly weird – in a fun way – how my new computer stayed totally cool and quiet no matter what I did with it. The only exception – the only time I was able to get the fan to come on at all – was when I played with Stable Diffusion AI image generation. For everything else I did with it, from high-end gaming to running piles of virtual machines at once, it would always sound exactly the same as it did when it was turned off. It would feel just as cool too! And it accomplished all of this while being overwhelmingly faster and more responsive than my older, x86-based Mac. It felt like science fiction!

I’ve never used an Arm desktop, but since desktops don’t have battery life to worry about, since size and weight are not very important to them, and since they’ve gotten pretty quiet these days already, I don’t think the improvement is going to be all that dramatic there. With more and more people just having laptops these days, I’m guessing what will ultimately happen is that Windows desktops will transition to Arm only once they need to in order to remain compatible with the same software everyone is using by then on Windows laptops. At least the improved power efficiency will make desktops better for the environment!

The Arm transition is going to involve a lot of pain, expense, and inconvenience, but yes I do think it’s worth it, and when we finally get to the other side of this, I feel confident you’ll agree with me!

Conclusion

This is a pretty abstract concept, and maybe it’s overambitious of me to be trying to explain it all in plain English. Please let me know how I did in the comments, and I will make changes based on feedback. If you want to contact me privately about something, please see the contact section. Bye for now! 🦇