Late last year, at the peak of the non-fungible token (NFT) craze, I purchased about a dozen of the highest-rated NFT books on Amazon:
I did this because I wanted to publish a balanced critique of NFTs; I figured the most honest approach would be to get familiar with the best-articulated arguments put forward by the proponents of this tech.
I soon discovered that all the books leaned on the same central argument: value is an inherent consequence of scarcity. My rebuttal was simple: that’s the exception, not the rule. My kids’ doodles are scarce, but in a gallery, they wouldn’t fetch much. Alas, while this made for a solid Twitter quip, it wasn’t enough for an in-depth post.
I ditched the project, but I couldn’t stop thinking about one of the books: NFT for Beginners by “Matt Martin”. At the time of my purchase, it ranked #3 in Amazon search results and had hundreds of five-star reviews.
At first glance, the book is simply bad. Consider the following incorrect definition of a non-fungible token:
You can also have a sensible chuckle at this bit of investment advice:
Still, that’s not out of the ordinary: just another self-published work by a hopelessly misguided enthusiast (or a cynical opportunist). Perhaps unfortunate, but hardly a crime.
But then, as I kept flipping the pages, things started to get weird. Midway through the book, the narrative seamlessly shifted from talking about NFTs to discussing “NTF”, apparently an acronym for “Net Price Calculator” (?!) and a part of a supply-chain entity called the Strategic Development Group:
This incoherent babble continued for several pages, culminating in another seamless topic shift to the clinical trials for a chemotherapy protocol called “NFP”:
My first theory was that the writing must have been outsourced to a content farm, where multiple individuals worked on portions of the text without caring to understand the context. But the explanation had a flaw: there’s nothing you can find on the internet if you search for “NFP chemotherapy” or “NTF Strategic Development Group”. The text didn’t merely describe a garbled version of our reality; it invented its own.
And then, I had an epiphany: I was probably looking at the output of an ML-based language model, such as GPT-3. These models have a remarkably good command of a variety of niche topics, but lack higher-order critical thought. They are prone to vivid confabulation, occasionally spew out self-contradictory paragraphs, and often drift off-topic, especially when tasked with generating longer runs of text.
I do not have proof, but I’m fairly confident I stumbled upon an early example of a monetized machine-generated book. It no longer ranks highly on Amazon and it’s been stripped of its hundreds of fake reviews, but I bet it served its purpose back in the day.
With GPT-3, we now have an infinitely scalable technology that is years away from being able to enrich our lives, but is already more than capable of drowning out all remnants of authentic content on the internet. And because you can leverage this to earn money or sway opinions, that outcome is probably hard to avoid.