Scrolling through Instagram this morning, I found three ads explaining that you can now use AI to rapidly produce blog posts and e-books to promote your business. You give the machine a prompt and it produces a block of text for you based on the information it has been fed. There’s no need to pay a writer and wait for them to turn out the required words; you simply generate them at the push of a button.
The most well-known of these text generators is a chatbot called ChatGPT, which was launched in November last year by California-based laboratory OpenAI. If you ask it a question, it draws on the enormous body of text it was trained on, much of it scraped from the web, to respond with answers in the form of essays, articles, and even fictional stories. In essence, it works like a scaled-up version of the auto-complete function on a text message. If you type a word into your phone, the phone will suggest the next word based on your past usage. ChatGPT operates in a similar manner, but at a far more complex level.
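To make the autocomplete analogy concrete, here is a toy sketch in Python (my own illustration for this column, not how ChatGPT is actually built): it simply counts which word most often follows each word in a sample sentence and then "predicts" the next word from those counts. A large language model does something conceptually similar, but with billions of learned parameters instead of a simple frequency table, which is what lets it produce whole paragraphs rather than single words.

from collections import Counter, defaultdict

# Toy next-word predictor: count which word most often follows each word.
sample = "the cat sat on the mat and the cat slept on the sofa"
words = sample.split()

follows = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    follows[current][nxt] += 1

def predict_next(word):
    # Return the most frequent follower of `word`, or None if we never saw it.
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

print(predict_next("the"))  # 'cat' - it followed 'the' twice in the sample text
print(predict_next("on"))   # 'the'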
But what seems like a nifty piece of software could actually cause seismic change in a number of unexpected places. For instance, lecturers at UK universities are already voicing concerns about students using the software to cheat the system by getting it to produce high-quality essays, and writers of online content are suddenly fearing for their jobs as AI churns out reams of content faster and more cheaply than any human could. Feature writers, journalists and columnists aren’t safe, either. In a few months, you might find this column is being written by a machine.
Creatives are also pushing back against the new technology. Visual artists have been up in arms for some time, claiming that AI-generated book covers and pictures are not only taking work away from human artists but also violating their copyright, because the AI draws on their existing art as the raw material from which it produces its images.
In a few years’ time, it seems reasonable to suppose a text generator could be fed the complete works of Shakespeare and then asked to produce new Shakespeare plays. We could have new novels in the style of Charles Dickens, Ernest Hemingway or Jane Austen. But would they be any good? This week, the singer Nick Cave spoke out about a song written in his style by ChatGPT, calling the artificially generated lyrics “a grotesque mockery of what it is to be human.”
Written and visual art are ways of communicating our humanity. They are deep expressions of emotion and experience. An automatically generated piece can never replicate that essential function, as the machine doesn’t feel or experience things the way we do. It isn’t even aware of what it’s writing, any more than your phone understands what you’re texting. Therefore, anything it produces can only ever be, to quote the Bard, “a tale told by an idiot, full of sound and fury, signifying nothing.”
I don’t think engineers are safe, either. Within a few years, it will be possible for these systems to draw on existing blueprints and engineering principles to automatically produce designs for bridges, cars, and tunnels. Today, they’re already capable of writing pieces of code, so software engineers might suddenly find themselves surplus to requirements.
Once the system has absorbed enough information, it will be able to produce anything—pasta recipes, haikus, architectural plans, textbooks…
And here’s where we brush up against a concept that was obsessing science fiction writers around the turn of the millennium.
When machines start to produce code, they may become able to design machines that are faster and cleverer than they are, and those machines will in turn design better machines still, until, within a handful of programming generations, their complexity and speed have improved exponentially beyond anything we lowly humans can understand. And eventually, there’s the possibility those systems will develop something analogous to self-awareness.
In previous columns, I’ve touched on automated resource gathering, transport and shipping, as well as automated battlefield weaponry. Once a computer system becomes able to control its own survival and the manufacture of more powerful computers without our help, everything changes. In the novels of Charles Stross and Vernor Vinge, this event is known as the Singularity. It is a point beyond which the future will be so different that we can’t currently imagine it. The genie will be out of the bottle, and it might no longer need us.
Gareth L Powell is an award-winning and widely lauded author at the forefront of speculative fiction. He has won the British Science Fiction Association (BSFA) Award for Best Novel twice, and been a finalist for the Locus, British Fantasy, Canopus, and Seiun awards. Find him at www.garethlpowell.com