Gibberish from the machine

I am deeply honored that Germany’s Stern magazine invited me to write about the intersection of AI and journalism for its 75th-anniversary edition. As I worked on my piece, I came across a new word: Kauderwelsch, German for gibberish. It also set me thinking about the impact of Gutenberg’s printing press, which over time transformed public discourse, creativity, and news into a commodity known as “content.” That mindset has fostered the belief that the primary value of journalists lies in producing content to feed the internet’s ever-growing demand. Some online news sites even impose content quotas on their reporters, and traditional news organizations have replaced the editor-in-chief with “chief content officers.”

But now we face a new challenge: generative artificial intelligence, specifically large language models (LLMs) such as ChatGPT. These machines are trained on vast amounts of text and can generate content that sounds just like us. Yet they have no understanding of the words they use and no concept of truth. Their only function is to predict the most likely next word in a sentence.

This was made plain in the now-infamous case of a New York lawyer named Steven Schwartz. He asked ChatGPT for precedents in a lawsuit over an airline snack cart and his client’s alleged knee injury. ChatGPT supplied several citations, but after Schwartz’s firm filed the legal brief in court, opposing counsel could not find the cases. The judge, P. Kevin Castel, ordered Schwartz to produce them, and Schwartz turned to ChatGPT once again. The machine assured him the cases were real, yet when he asked for the full opinions, what it generated was nonsense. The judge called them “gibberish” and ordered Schwartz and his colleagues to appear in court to explain themselves.

As a journalist, I was present to witness these attorneys being humbled by the technology and by the media covering the case. Their own lawyer acknowledged the dangers of relying on ChatGPT for legal research and praised the judge for warning the public about those risks. Judge Castel, however, made clear that his intention was not to expose the flaws of the technology but to hold the lawyers accountable for their negligence.

In the end, the problem was not the technology itself but the lawyers who misused it and ignored warnings about its limitations. As journalists, we have a responsibility to educate the public about the potential pitfalls of AI and to hold those who use it accountable for their actions.
