5 Comments

It's a little unfair to compare old CGA-style graphics in terms of how many colors could be used *per pixel*. Old-style CGA graphics often depended on the blurriness of pixels on a CRT screen, so sometimes more than one pixel was used to produce a color at a specific spot. CGA graphics looks abysmal on an LCD, but with the right hardware, everything falls into place. Of course, it's still far from 16-bit graphics, but let's compare with our eyes as the point of view, not the pixel.

https://www.youtube.com/watch?v=niKblgZupOc (from 4:54)

Author · Feb 27, 2023 (edited)

You know, several folks made that point on Twitter, and I might have accidentally nerd-sniped them by not including a lengthy discussion of these effects, but it really feels orthogonal here?

Scanlines made low-resolution graphics look a bit better, but the effect is often cherry-picked and exaggerated in videos; especially at resolutions such as 256×192, you could most certainly see individual pixels just fine, and they spanned multiple scanlines. Artifact color and other "extra color" techniques were not at all common in the 8-bit era, and didn't work consistently across video systems (NTSC / PAL). I don't believe there was a single ZX Spectrum title that used them in the platform's heyday, for example?

While all that would make for an interesting subject for another article, I'm really not sure it changes the thesis here: 8-bit graphics looked awful due to memory constraints that severely restricted palettes, and what games are imitating now is the 1990s style of late 16-bit and early 32-bit platforms.

My point isn't linguistic pedantry; this is just a nice opportunity to explore what actually made 8-bit platforms bad, as it had little to do with the number of bits.


Take a look at Atari 800XL games; they used a wide gamut of colors. https://www.atariarchives.org/agagd/chapter1.php

Feb 28, 2023 (edited)

I don't disagree with the core argument of your post. I agree that today's "8-bit look" is more similar to graphics produced in the 16-bit era. I'm just commenting on the details.

Every time dithering was used, it was because the CRT would blur the pixels and display the surface with the illusion of a different color. It wouldn't be a perfect pixel, but sometimes the illusion was good enough. Dithering is also used in the pictures in your post. Also, the ZX Spectrum might not be the best example of the "8-bit era", because "8-bit" is an umbrella term spanning multiple generations of machines. The NES belongs to a later generation, and NES games look very different from ZX Spectrum titles. The ZX Spectrum also had no hardware sprite support, while most 8-bit games wouldn't have been possible to produce without sprites.
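
A minimal sketch of that blur-plus-dither effect (not tied to any real machine; the colors and the crude "blur" are made up for the demo): a checkerboard of two palette colors averages out to an in-between color once the display smears neighboring pixels together.

```python
# Rough illustration: a 2x2 checkerboard "dither" of two palette colors,
# and what a crude CRT-style blur does to it. All values are made up.

def checkerboard(color_a, color_b, size=4):
    """Tile two RGB colors in a checkerboard pattern."""
    return [[color_a if (x + y) % 2 == 0 else color_b for x in range(size)]
            for y in range(size)]

def box_blur_average(image):
    """Average the whole patch -- a stand-in for the CRT smearing pixels together."""
    pixels = [p for row in image for p in row]
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) // n for c in range(3))

# Two colors from a hypothetical fixed palette: pure red and pure blue.
patch = checkerboard((255, 0, 0), (0, 0, 255))
print(box_blur_average(patch))   # -> (127, 0, 127): the eye sees a purple-ish blend
```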

Another thing: while it's not wrong to say that memory was a very significant limiting factor for graphics quality, I think one has to wonder if it was the core reason. If memory sizes had somehow been bigger, would the situation have changed much? I have my doubts, since the CPU wasn't fast enough to update the whole screen in one frame anyway (that's why sprites were invented). For example, the VIC-II graphics chip (used in the Commodore 64) was designed to interface with the 6510 CPU, taking its clock speed into account. Merely drawing lots of colors on the screen might not be a problem (addressing beyond 8-bit registers can be solved by bank switching), but animating it would be a real problem, because rendering a single full frame in software could take several seconds.
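
A rough back-of-envelope sketch of that bandwidth argument: the clock speed, refresh rate, and bitmap size below are standard C64 figures, while the cycles-per-byte cost is an assumed best case for an unrolled 6510 copy loop.

```python
# Back-of-envelope: how much of the C64's hi-res bitmap the CPU could even
# touch per frame. CYCLES_PER_BYTE is an assumption (lda abs,x / sta abs,x).

CPU_HZ = 1_000_000          # 6510 clock, roughly 1 MHz
FRAME_HZ = 50               # PAL refresh rate
BITMAP_BYTES = 8_000        # 320x200 hi-res bitmap
CYCLES_PER_BYTE = 9         # assumed cost of the fastest straight copy

cycles_per_frame = CPU_HZ // FRAME_HZ                  # ~20,000 cycles
bytes_per_frame = cycles_per_frame // CYCLES_PER_BYTE  # ~2,200 bytes
frames_to_copy_bitmap = BITMAP_BYTES / bytes_per_frame

print(f"{bytes_per_frame} bytes movable per frame")
print(f"~{frames_to_copy_bitmap:.1f} frames just to copy a full bitmap")
# Actually computing new graphics costs far more than a raw copy, which is
# why hardware sprites and character/tile modes mattered so much.
```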


Yeah, good point. I think those pictures look particularly bad though? It was possible to use custom characters as tiles to draw nicer things:

https://www.c64-wiki.com/wiki/Ultima_III_%E2%80%93_Exodus
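
As a quick aside on why character/tile tricks helped, here's a memory-math sketch using standard C64 figures; the takeaway is how little data a tile map needs compared to a full bitmap.

```python
# Rough memory math for character/tile graphics vs. a full bitmap (C64 figures).

SCREEN_COLS, SCREEN_ROWS = 40, 25
TILE_MAP_BYTES = SCREEN_COLS * SCREEN_ROWS  # 1,000 bytes of character indices
CHARSET_BYTES = 256 * 8                     # up to 2,048 bytes of custom 8x8 glyphs
BITMAP_BYTES = 8_000                        # full 320x200 hi-res bitmap

print(f"tile map: {TILE_MAP_BYTES} B + charset: {CHARSET_BYTES} B "
      f"vs. bitmap: {BITMAP_BYTES} B")
# Redrawing the screen only means rewriting the 1,000-byte tile map, so a
# custom character set lets the CPU keep up where a full bitmap couldn't.
```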
