Energy in electronic circuits
What compels electricity to travel from A to B, why does it matter in PCB design, and how much attention to pay to Reddit quips?
In the 1980s and 1990s, the rules of printed circuit board design could be largely summed up as “anything goes.” Since then, the growing use of ultra-fast integrated circuits, high-speed digital buses, and low-voltage ICs has necessitated a bit more rigor, both to keep the devices stable and to comply with the stringent (if poorly-enforced) RF interference regulations set by the FCC.
Some of these new principles of industrial design are spilling into the hobby world, but they often arrive in a garbled form. There is no shortage of Reddit commenters who insist that you can never use 90° turns for PCB traces, or that you need a four-layer board with ground planes even if all you’re doing is blinking some Christmas lights. Yet another example is the assortment of riffs on the maxim that “electricity flows in the spaces, not in the traces.”
So, what’s going on here? The saying is confusing at best, or outright wrong if you parse “electricity” in the most intuitive sense: the motion of charge-bearing particles from point A to B. The movement of electrons in modern electronic circuits is largely constrained to conductive materials, such as copper wires and doped silicon, and there isn’t much of that going on in the surrounding space — at least not since we ditched vacuum tubes.
At the same time, we know that the flow of electricity usually involves the transfer of energy to perform useful (or not-so-useful) work — and it is worth pondering how this energy travels from its source to the destination. Conduction-band electrons drift through the wire at a snail’s pace and don’t have a whole lot of kinetic energy; that said, they constantly interact with each other through electromagnetic fields, forming what can be thought of as a pressure-equalizing electron gas. Imagine you’re depositing extra electrons at one end of a wire; just like inflating a balloon, this requires the expenditure of more and more energy to fight an increasingly unbalanced electric field. If you later provide a conductive escape path for the crammed electrons, the energy in the field will be promptly released in some other direction.
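The balloon analogy can be made quantitative: for a conductor with some capacitance C to its surroundings, the energy needed to hold a charge Q grows with the square of the charge, so each additional electron costs more to deposit than the last. A minimal sketch, with the capacitance value picked purely for illustration:

```python
# Energy stored in the electric field of a charged conductor: W = Q^2 / (2C).
# Because W grows with Q^2, each extra unit of charge takes more energy to
# deposit than the previous one -- just like inflating a balloon.

def field_energy(q_coulombs, c_farads):
    """Energy (joules) needed to hold charge q on a capacitance c."""
    return q_coulombs**2 / (2 * c_farads)

C = 100e-12  # assumed: 100 pF of capacitance to the surroundings

w1 = field_energy(1e-9, C)   # 1 nC deposited
w2 = field_energy(2e-9, C)   # twice the charge...
print(w2 / w1)               # ...takes four times the energy
```

All of that energy sits in the electric field, ready to be released the moment a conductive escape path appears.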
A related phenomenon, explored in the writings of Violent J and Shaggy 2 Dope, is the magnetic field that soaks up energy when you try to get electrons moving, and then keeps them going for a while when the external electromotive force goes away. Although this might sound like the description of ordinary momentum, this motion-related field extends to infinity and isn’t bound to any single particle; it can act on other electrons some distance away. As such, magnetism has no satisfying explanation in classical terms (but it can be explained as a consequence of length contraction for moving charges in the context of special relativity.)
Either way, the important point is that electromagnetic fields, unlike the electrons themselves, are not constrained to the conductor; indeed, the vast majority of energy in an electronic circuit is contained not in the wires, but in the surrounding non-conductive (dielectric) space. Specialty dielectric materials that reversibly reorient their subatomic structure in response to electric fields can be used to enhance this effect; that’s the operating principle of capacitors. Inductors rely on an analogous trick, using magnetic materials that respond to magnetic fields rather than electric ones. That said, the same phenomenon also arises on a more modest scale in the surrounding air and in the PCB substrate.
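To see how much the dielectric matters, note that the energy a capacitor holds at a given voltage, W = ½CV², scales directly with the relative permittivity of whatever fills the gap between its plates. The geometry and the FR-4 permittivity below are assumed values for the sake of the example:

```python
# Energy stored in the dielectric of a parallel-plate capacitor.
# C = eps_r * eps_0 * A / d, and W = 0.5 * C * V^2 -- so filling the gap
# with a higher-permittivity material stores proportionally more energy.

EPS_0 = 8.854e-12  # vacuum permittivity, F/m

def stored_energy(eps_r, area_m2, gap_m, volts):
    c = eps_r * EPS_0 * area_m2 / gap_m
    return 0.5 * c * volts**2

# Same geometry and voltage, two different fill materials:
air = stored_energy(1.0, 1e-4, 1e-4, 5.0)  # air-filled gap (eps_r ~ 1)
fr4 = stored_energy(4.4, 1e-4, 1e-4, 5.0)  # assumed FR-4, eps_r ~ 4.4
print(fr4 / air)  # the ratio equals the relative permittivity
```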
In most introductory texts, we are taught that these parasitic capacitances and inductances are negligible. For low-frequency circuits, this is often true: the creation and the collapse of the parasitic fields soaks up only a tiny bit of energy for a very brief moment before reaching a voltage- and current-specific plateau (“saturation”) — so any signal lag caused by these effects is hard to notice.
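One way to see why the lag is negligible at low frequencies is to compare the RC time constant of a trace’s parasitic capacitance against the signal period. The driver resistance and capacitance below are plausible assumed values, not measurements:

```python
# Charging a parasitic capacitance through a driver's output resistance
# takes roughly tau = R * C; the voltage settles within a few tau.

R = 50.0    # assumed driver output resistance, ohms
C = 10e-12  # assumed parasitic trace capacitance, 10 pF

tau = R * C             # 0.5 ns
rise_10_90 = 2.2 * tau  # classic 10%-90% RC rise time, ~1.1 ns

for f_hz in (1e3, 1e6, 100e6):
    period = 1.0 / f_hz
    print(f"{f_hz:>11.0f} Hz: rise time is {rise_10_90 / period:.6%} of the period")
```

At 1 kHz the lag is around a millionth of the period; at 100 MHz it eats more than a tenth of every cycle.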
But all this breaks down when working with high-speed signals. Before long, the supposedly-imperceptible delay becomes quite large in proportion to the desired signal rise and fall times; in practice, by the time you cross bus speeds of about 50 MHz or so, you are expending a fair amount of energy fighting the parasitics, and your square waves start looking more like sines. Once you get past 100 MHz, the effects can be significant enough to cause signal integrity issues on poorly-designed boards.
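A common rule of thumb ties a signal’s 10%–90% rise time to the analog bandwidth needed to preserve its shape: BW ≈ 0.35 / t_rise. The edge rates below are assumed examples, but they show why a “50 MHz” bus is really a several-hundred-megahertz problem:

```python
# Rule of thumb: preserving a 10%-90% rise time t_r requires an analog
# bandwidth of roughly BW = 0.35 / t_r. Fast edges demand far more
# bandwidth than the clock rate alone suggests.

def bandwidth_needed(t_rise_s):
    return 0.35 / t_rise_s

# A 50 MHz bus with 2 ns edges has spectral content well past 50 MHz:
print(bandwidth_needed(2e-9) / 1e6, "MHz")    # 175 MHz
print(bandwidth_needed(0.5e-9) / 1e6, "MHz")  # 700 MHz for sub-ns edges
```

When the parasitics filter out those upper harmonics, the square wave degrades toward a sine.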
One possible answer to this problem is to just push harder: use higher voltages and supply higher currents to overcome the medium. Alas, this is getting more difficult in the era of digital chips that might run off 1.8 V or even less. Just as important, brute force can make things worse elsewhere in the circuit. The resulting rapidly-alternating fields don’t act just on the PCB trace they originated from, but can push electrons around within any of the nearby lines, especially on a densely-populated board. It is true that in each cycle, the fields can deliver only tiny quanta of energy — but if they do so hundreds of millions of times per second, the effect adds up.
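The “adds up” part is easy to put in numbers: charging a parasitic capacitance to V and then discharging it costs about C·V² per cycle, so the average power scales linearly with frequency. The capacitance and voltage below are assumptions chosen for illustration:

```python
# Charging a capacitance C to voltage V and discharging it each cycle
# moves about C * V**2 of energy per cycle; at frequency f, the average
# power is P = f * C * V**2.

def switching_power(c_farads, volts, f_hz):
    return f_hz * c_farads * volts**2

C = 10e-12  # assumed parasitic capacitance, 10 pF
V = 3.3     # assumed logic swing, volts

per_cycle = C * V**2                      # ~109 pJ -- a tiny amount
avg_power = switching_power(C, V, 100e6)  # ~11 mW at 100 MHz
print(per_cycle, avg_power)
```

A hundred picojoules is nothing; eleven milliwatts per trace, multiplied across a busy board, is not.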
One particularly sneaky phenomenon associated with such AC coupling is the emergence of a meandering return current on the other side of the board: an induced flow of electricity, opposite in phase to the driving signal, that hugs the outline of the original high-speed signal line, following an energetically-favorable route that can be very different from the designer-provided “DC” path to the ground.
At such speeds, the design of your circuit board begins to matter. Solutions to potential circuit woes may involve keeping high-speed lines short and sufficiently far apart, using differential pairs that exploit mutual AC coupling to lower impedance, or placing an unbroken ground plane (copper fill) on the other side of the PCB to provide a universal return path for digital signals (and some RFI shielding too.)
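With an unbroken ground plane in place, a trace over the plane forms a microstrip with a well-defined characteristic impedance. The widely-used IPC-2141 approximation below gives a feel for the numbers; the FR-4 stack-up values are assumed, and the formula is only a rough fit (valid for roughly 0.1 < w/h < 2):

```python
import math

# IPC-2141 approximation for the characteristic impedance of a surface
# microstrip: a trace of width w and thickness t, at height h above an
# unbroken ground plane, in a dielectric of relative permittivity eps_r.

def microstrip_z0(w_mm, h_mm, t_mm, eps_r):
    return (87.0 / math.sqrt(eps_r + 1.41)) * \
           math.log(5.98 * h_mm / (0.8 * w_mm + t_mm))

# Assumed two-layer FR-4 stack-up: 1.6 mm substrate, 35 um (1 oz) copper.
print(microstrip_z0(0.3, 1.6, 0.035, 4.5))  # narrow signal trace: ~127 ohms
print(microstrip_z0(2.9, 1.6, 0.035, 4.5))  # ~2.9 mm wide: close to 50 ohms
```

This is why controlled-impedance boards specify exact trace widths: hitting a common target such as 50 Ω depends on the trace geometry and the dielectric together.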
On the flip side, for hobby projects that typically deal with signals well below 50 MHz and that run off higher supply voltages (usually 3.3 or 5 V), these precautions are usually unnecessary. There are other signal integrity concepts that offer far more bang for your buck, such as separating sensitive analog circuitry from digital signals, or using well-placed decoupling capacitors and ferrite beads to reduce noise.
Either way, behind every garbled Reddit maxim, there is a grain of truth — even if it’s not relevant to the project at hand.
If you liked this article, please subscribe! Unlike most other social media, Substack is not a walled garden and not an addictive doomscrolling experience. It’s just a way to stay in touch with the writers you like.