How an AI-written Book Shows Why the Tech 'Terrifies' Creatives


For Christmas I got an intriguing present from a friend - my very own "bestselling" book.

"Tech-Splaining for Dummies" (excellent title) bears my name and my photo on its cover, and it has glowing reviews.

Yet it was entirely written by AI, with a few simple prompts about me supplied by my friend Janet.

It's a fascinating read, and extremely amusing in parts. But it also meanders quite a lot, and sits somewhere between a self-help book and a stream of anecdotes.

It mimics my chatty style of writing, but it's also a bit repetitive, and very verbose. It may have gone beyond Janet's prompts in collating data about me.

Several sentences begin "as a leading technology journalist..." - cringe - which may have been scraped from an online bio.

There's also a mysterious, recurring hallucination in the form of my cat (I have no pets). And there's a metaphor on practically every page - some more random than others.

There are plenty of firms online offering AI book-writing services. My book was from BookByAnyone.

When I contacted the chief executive Adir Mashiach, based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.

A paperback copy of your own 240-page bestseller costs £26. The firm uses its own AI tools to generate them, based on an open-source large language model.

I'm not asking you to buy my book. Actually you can't - only Janet, who created it, can order any further copies.

There is currently no barrier to anyone creating a book in anyone's name, including celebrities - although Mr Mashiach says there are guardrails around abusive content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "solely to bring humour and delight".

Legally, the copyright belongs to the firm, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books do not go on sale more widely.

He hopes to broaden his range, producing different genres such as sci-fi, and perhaps offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated goods to human customers.

It's also a bit frightening if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in parts, sound a lot like me.

Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then produce similar content based upon it.

"We should be clear, when we are talking about data here, we actually mean human creators' life's work," says Ed Newton Rex, founder of Fairly Trained, which campaigns for AI firms to respect creators' rights.

"This is books, this is articles, this is pictures. It's masterpieces. It's records ... The entire point of AI training is to find out how to do something and then do more like that."

In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. That didn't stop the track's creator from trying to nominate it for a Grammy award. And although the artists were fake, it was still wildly popular.

"I don't think the use of generative AI for creative purposes should be banned, but I do think that generative AI for these purposes that is trained on people's work without permission should be banned," Mr Newton Rex adds. "AI can be very powerful but let's build it ethically and fairly."


In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT creator OpenAI, for example.

The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the web to help develop their models, unless the rights holders opt out.

Ed Newton Rex explains this as "madness".

He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.

"All of these things work without going and changing copyright law and ruining the livelihoods of the country's creatives," he argues.

Baroness Kidron, a crossbench peer in the House of Lords, is also strongly opposed to weakening copyright law for AI.

"Creative industries are wealth creators, 2.4 million jobs and a whole lot of joy," says the Baroness, who is also an adviser to the Institute for Ethics in AI at Oxford University.

"The government is undermining one of its best performing industries on the vague promise of growth."

A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for rights holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for rights holders from AI developers."

Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.

In the US the future of federal rules to regulate AI is now up in the air following President Trump's return to the presidency.

In 2023 Biden signed an executive order that aimed to boost the safety of AI with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.

But this has now been repealed by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.

This comes as a number of lawsuits against AI firms, and particularly against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.

They claim that the AI firms broke the law when they took their content from the web without their consent, and used it to train their systems.

The AI firms argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.

If all of this wasn't enough to ponder, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.

DeepSeek claims that it developed its technology for a fraction of the cost of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.

As for me and a career as an author, I think that at the moment, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for bigger projects. It is full of errors and hallucinations, and it can be quite hard to read in parts because it's so long-winded.

But given how rapidly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are better.
