The 8-bit (MCU) manifesto
Eight-bit microcontrollers aren't relics; they are a design philosophy. You might want to use them more.
Last week, my article about AVR Dx microcontrollers made it to the front page of Hacker News. Many commenters on the site reminisced about the good old days of 8-bit microcontrollers — and then advised others to let go of the past. Instead of clinging to the AVR architecture, they recommended prototyping with an Orange Pi 5 Pro. Or perhaps buying an Adafruit ESP32 board. Worst-case, you should be giving some thought to STM32G0 or CH32V003.
All of these recommendations are right — and all of them are wrong. Oversimplifying a bit, the market for general-purpose embedded processors can be divided into three major segments: 8-bit MCUs, 32-bit MCUs, and systems-on-a-chip (SoC / SBC).
The segmentation isn’t about progressing from the worst to the best. Instead, it goes like this:
8-bit MCUs are designed for process control — that is, taking data from sensors and input surfaces, and then operating small displays, motors, or other output devices according to some straightforward plan. The chips are orders of magnitude faster than their 1980s progenitors, but don’t have a lot of RAM — simply because it’s not needed for the job. Instead, the MCU die is crammed with gadgets useful for instrumentation: op-amps, voltage comparators, top-notch ADCs and DACs, multi-voltage I/O ports, and so on. The category is geared toward low power consumption and a supremely predictable and intuitive real-time operating environment.
Most 32-bit MCUs are designed for memory- and CPU-intensive tasks: processing video or audio streams, high-speed communications, and so on. They might have a decent amount of RAM, a DMA controller, a MIPI interface, network transceivers, hardware crypto, and an FPU. On the flip side, the instrumentation features usually lag behind; on most lower-cost chips, you’d be lucky to get some 5V-tolerant pins. Predictable execution is sacrificed for instruction pipelining, data prefetching, write caching, and so on. All the added complexity also means that something as simple as changing the clock speed on a Cortex-M7 chip can be a small ordeal.
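The "small ordeal" is real: raising the core clock on a Cortex-M7 part typically means adding flash wait states, reprogramming a PLL while it's stopped, and only then switching the system clock over — in that order. A rough CMSIS-style sketch, assuming an STM32F7-class part; the register and bit names approximate the vendor headers but are illustrative, and `PLL_M`/`PLL_N`/`PLL_P` stand in for values you'd compute from the reference manual:

```c
/* Illustrative register-configuration fragment only — check the
 * reference manual; exact names and values vary by part. */

/* 1. Flash can't keep up at high clock speeds without wait states. */
FLASH->ACR = (FLASH->ACR & ~FLASH_ACR_LATENCY) | FLASH_ACR_LATENCY_7WS;

/* 2. The PLL can only be reprogrammed while it is off. */
RCC->CR &= ~RCC_CR_PLLON;
while (RCC->CR & RCC_CR_PLLRDY) { }            /* wait for lock to drop */
RCC->PLLCFGR = PLL_M | PLL_N | PLL_P | RCC_PLLCFGR_PLLSRC_HSE;
RCC->CR |= RCC_CR_PLLON;
while (!(RCC->CR & RCC_CR_PLLRDY)) { }         /* wait for lock */

/* 3. Only now switch SYSCLK over to the PLL output. */
RCC->CFGR = (RCC->CFGR & ~RCC_CFGR_SW) | RCC_CFGR_SW_PLL;
while ((RCC->CFGR & RCC_CFGR_SWS) != RCC_CFGR_SWS_PLL) { }

/* ...and depending on the target frequency, the bus prescalers,
 * core voltage scaling, and overdrive mode may each need attention too. */
```

On an 8-bit AVR, by contrast, the equivalent operation is usually a write or two to a single clock-control register.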
Finally, SoCs are just desktop computers in disguise. They will usually have a high-performance 32-bit or 64-bit processor running at gigahertz speeds, gobs of RAM, an integrated GPU, Bluetooth, and so forth. Their primary niche is multifunction Linux-based devices: smartphones, gaming consoles, or entertainment systems. On the flip side, a SoC is an expensive and clunky way to do process automation or signal processing. There are gotchas that come with having a million lines of code standing between you and that I/O port.
Nothing requires the use of an 8-bit ALU for process automation; it’s just that there are limited benefits to designing a separate “lo-fi” 32-bit platform for process control when comparatively simpler 8-bit architectures work fine. To be fair, there are some barebones 32-bit RISC-V and ARM Cortex-M0 contenders in the ring — but in the end, it’s not necessarily clever to choose more bits. It might help, it might hurt if it pushes you toward the wrong product — but most often, it makes no difference at all.
We could wrap up here, but wi-fi throws a wrench in the works. Most MCUs don't support it, probably because it's a complicated, computationally intensive protocol that is better offloaded to a companion chip. Enter Espressif: the company came up with a series of wi-fi chipsets that could be interfaced to standard microcontrollers. But they also had an epiphany: if they need all this computing power for wi-fi, they might as well carve out some room for user-supplied code — and effectively start selling dirt-cheap 32-bit microcontrollers with wireless connectivity baked in.
Because of this, a lot of internet-enabled process automation and simple compute is done with Espressif chips such as the ESP32 series — even though the basic models are not particularly well-suited for either job. But hey — ideological purity is for suckers, and free wi-fi is a steal.
For a catalog of my articles on programming 8-bit and 32-bit MCUs, click here.
As a hobbyist who's more comfortable with programming than with designing circuit boards, I've only used USB-programmable single-board devices. My current favorite is a Raspberry Pi Pico, or a Pico-W for WiFi. So, what the chip can do is less directly relevant to me than what the board can do.
These 32-bit chips are pretty complicated, but it seems like the board and the programming environment don't have to be. And since they have through-hole pins, I can solder socket headers onto them, or maybe something else if I could decide on a better connector. (There are too many to choose from; after reading about many of them, I gave up.)
But sometimes I've wondered about a no-soldering solution: suppose there were a board with *two* USB ports on opposite ends, basically to act as a programmable dongle. I could plug it into the laptop with a USB-C connector and use the USB-A on the opposite end for whatever peripheral I wanted to attach. It could also run without the laptop if I plugged it into a USB battery or wall adapter, which means I wouldn't need to think about power supply issues either.
I guess this would seem pretty over-built to someone who works at a lower level, but the nice thing is that USB ports are designed to be hot-pluggable and people know what to do with them.
For me, there's an overriding reason to use an 8-bit MCU: through-hole packaging. I don't think I could handle anything like LQFP. Perhaps even more than that, though, it just feels _wasteful_ to use a full ARM core for something that can be easily handled by a PIC.