
Opinion: Don’t panic yet, writers and artists — an AI takeover is not inevitable

Could a bot write this intro? The hype around AI has pushed this question to the center of public discourse. Conversational bots such as ChatGPT and automated image generators such as DALL-E — both built by OpenAI — are popping up everywhere.

Despite a few optimistic case studies of their potential, the current models are limited and the results they produce deeply flawed. But it doesn’t seem to matter that the tech behind AI is not ready for prime time. The models only have to tell a convincing story to the humans signing the checks — and they are.

Microsoft, which has developed its own Bing chatbot, invested $13 billion in OpenAI. Early-stage venture companies poured $2.2 billion into generative AI last year alone, and this year Salesforce announced a $250-million fund to invest in the space. Headlines and institutions alike are declaring AI the future of work, poised in particular to replace writers and artists. That these breathless predictions are outpacing the quality of the tech itself says a lot about our cultural moment — and our long-festering trend of devaluing creative work.

The supposed promise of the AI future is efficient and abundant content. Office workers can now generate entire presentations with a prompt or click. Creative agencies are using image generators to mock up client concepts. Even literary magazines have reported being bombarded with AI-generated submissions, and yes, editors are hitting publish on AI-generated articles and illustrations.

But the AI models have proved time and again that they perpetuate biases, misunderstand cultural context and prioritize being convincing over telling the truth. They draw on data sets of creative work by humans, an approach that might otherwise be labeled plagiarism or data-mining. And the models driving ChatGPT and DALL-E are black boxes, so the data’s origins can’t technically be traced.

Today, these and other models require humans (with their own biases) to train them toward “good” results and then check their work at the end. Because the tools are built for pattern-matching, their results are often repetitive and thin, an aesthetic of similitude rather than invention.

The impetus to replace human workers, then, doesn’t come from slam-dunk capacities of the tech. It stems from years of companies big and small — especially those in publishing, tech and media — turning the screws on creative work to spend ever less on workers.

In today’s financial downturn, even tech companies are cutting costs through mass layoffs (including slashing AI ethics teams) while funding and selling AI tools. But the situation is more dire for the writers, artists and musicians who have been struggling to make a living for a long time.

Pay for writers, editors and illustrators in this country has stagnated over the past two decades. Some nations have started treating art as a public good: Ireland is experimenting with paying artists to make art, and other countries are subsidizing audiences for art. But in the U.S., public funding for the arts is embarrassingly low compared with other wealthy Western nations and dipped further still during the pandemic. Many artists need to migrate from one social media app to another to build an audience for their work and eke out an income.

Meanwhile, ubiquitous streaming subscriptions and algorithmic feeds, with their laser focus on getting the most engagement, have flattened creativity into an infinite scroll of mediocre, repetitive styles — an unsustainable model for original work.

Automation is the next chapter in this tale of ever cheaper content. How much lower can art’s value go? Why pay creative workers living wages when you can program machines to churn out interchangeable content units?

Well, because these models are no substitute for human creative labor. If we want to break out of repetitive molds, strive to unravel biases and build new possibilities, the work must come from humans.

The danger in reducing creative work to widgets for outsourcing is that we lose the steps of reflection and iteration that produce new connections. The large language models behind chatbots are designed to deliver a single, authoritative response, contracting the world to the span of the information they’ve already been fed.

The human brain, on the other hand, has a unique capacity for recursive processing that allows us to interpret ideas beyond a set of rules. Each step of the creative process — no matter how slow, small or boring — is an expansive act, transporting a concept into a new place and imagining a wider world than what exists today.

An AI takeover is not inevitable, despite what some business and tech leaders say. This is not the first tech hype cycle, and some regulators, unions and artists are already pushing back. In the wake of the crypto collapse, the Federal Trade Commission established an Office of Technology to support enforcement in emerging tech areas, and the agency has released multiple public warnings that false claims about products’ AI capabilities will be challenged.

The Writers Guild of America, which is poised to go on strike, has proposed protections and regulatory standards around the use of AI in script-writing. SAG-AFTRA, the screen actors and TV and radio workers union, has stated that if studios want to use AI to simulate actor performances, they’ll have to negotiate with the union. Some researchers are building tools to protect the work of visual artists from being absorbed into models for image generators, and others have launched open-source systems to highlight biases in AI models.

But the broader call to action is a cultural one: to recognize that creative work is not merely a commodity or content, but a necessary and highly skilled practice that deserves robust funding and support. Creativity is how meaning is constructed in culture. This is a task that can’t be done by machines — and shouldn’t be controlled by the companies that build them. A bot may be able to swiftly write an ending to this story, but we have to ask ourselves: Whose voices do we actually need?

Rebecca Ackermann has written about tech and culture for MIT Technology Review, Slate and elsewhere. Previously, she worked as a designer at tech companies such as Google and NerdWallet.
