In the area of coding, there are certainly huge changes ahead.
An example: I'm not a coder, but sometimes I want to solve a trivial problem of mine with some scripting. Believe me, designing and writing all those loops and conditions was real mental training, and definitely not much fun. It also took a lot of time that I could have spent better, working on other parts of my project. Before, I had to do the same thing professionals do in so many jokes: read e.g. Stack Overflow and pick out the crumbs useful for my solution. And sometimes (as now, when I'm working on FontForge scripting) there were no solutions, so I had to read the documentation, which is usually really easy for coders, but not for me. Several times I tried to ask for help. Only once did I get an answer, because the topic was "fresh" enough; the other times I waited for weeks and there was silence.
Now I just have to write something like a "ladder" or an outline: first do this, then do that. I can even ask it to design certain functions so that I can control the data flow. But I don't have to code. I just need to understand the code well enough to interact with a really fast savant (you know, like Rain Man). What used to take me weeks now takes me hours.
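Just to give a flavour, here is roughly the kind of tiny script I end up with (the file names and the width threshold are made up, and it assumes FontForge's own bundled Python module):

    import fontforge  # FontForge's built-in Python module; run with FontForge's Python

    font = fontforge.open("MyFont.sfd")        # placeholder source file
    for glyph in font.glyphs():
        # flag glyphs narrower than an arbitrary threshold of 300 em units
        if glyph.width < 300:
            print(glyph.glyphname, glyph.width)
    font.generate("MyFont.ttf")                # export the checked font

The point is that I can describe steps like these in plain words and let the model fill in the syntax.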
So your prophecy really may come true. It is sad that this subtle network of mutual help will disappear. And when it vanishes, the free help from LLMs will fade too. Why should it stay free if there is no free competition left?
I really hope Neal Stephenson's "bogon" (from Anathem) catches on for this stuff.
On a more serious note, I think certificate chains (cleverly hidden behind an intuitive UI) that let people judge the source/path of information are the only hope.
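To make that concrete, here is a toy sketch in Python of what "judging the path" could look like, using Ed25519 signatures from the cryptography package (the two-link root-to-publisher chain and all the names are purely illustrative, nothing like a real PKI or provenance format):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519
    from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

    # A root authority and a publisher it vouches for (toy two-link chain).
    root_key = ed25519.Ed25519PrivateKey.generate()
    publisher_key = ed25519.Ed25519PrivateKey.generate()

    # Link 1: the root signs the publisher's public key.
    publisher_pub = publisher_key.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
    endorsement = root_key.sign(publisher_pub)

    # Link 2: the publisher signs the actual content.
    article = b"Some piece of information whose origin we want to judge."
    content_sig = publisher_key.sign(article)

    def chain_is_valid(root_pub, publisher_pub, endorsement, article, content_sig):
        # Walk the chain back from the content to a trusted root.
        try:
            root_pub.verify(endorsement, publisher_pub)
            ed25519.Ed25519PublicKey.from_public_bytes(publisher_pub).verify(content_sig, article)
            return True
        except InvalidSignature:
            return False

    print(chain_is_valid(root_key.public_key(), publisher_pub, endorsement, article, content_sig))

A good UI would do all of that walking invisibly and just show the reader where the information came from.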
A slightly less bleak take: one possibility is that things like LLMs might simply raise the level of abstraction of human interaction with technology. Something like how computer user interface abstractions have evolved over time so that we never have to think about the fact that what we are ultimately doing is directing electrical voltages. True creativity, whatever that means, still seems to be the domain of human ingenuity. ChatGPT-like technology might free humanity to not "sweat the details," so to speak.
Or maybe I am just influenced by the fictional human-computer interaction portrayed in Star Trek: Voyager, which I am binging through right now. The idea of "programming" a holodeck experience by simply describing the end result, instead of worrying about 3D modeling, rendering, GPUs and so forth, seems within reach.
On that topic: https://www.ic3.gov/Media/News/2024/240709.pdf
I'm sure we'll know when LLMs replace human customer support: once we start seeing complete formal sentences with correct grammar :)