From the GPS in our cars to the chatbot that takes our grocery order, AI continues to ingrain itself into our society, expert says
WV Press Association Report
CHARLESTON, W.Va. – “We need to be able to tell snake oil from legitimate technological innovation. We really all need to be informed about these [Artificial Intelligence] tools because they are in use in our lives – right down to what you see at the top of your social media feed.”
That’s what West Virginia University’s (WVU) Amy Beth Cyphert said regarding her upcoming appearance at the WV Press Association’s annual convention, Aug. 11-12 in Charleston.
Cyphert, who serves WVU as both Lecturer in Law and director of the university’s ASPIRE Office, will present an educational seminar focusing on the “ethics and implications” of the use of Artificial Intelligence (AI) in communications.
“If we understand [AI], and we’re able to participate in the policy discussions, then I think the future can be much better,” Cyphert noted. “I believe that technology can be a great thing, and that it can work for us – but it can’t work for us if we don’t understand it and we’re not all at the table.”
“We cannot just hope that large tech companies will act in our collective best interests,” Cyphert added. “We’ve got to be a part of the conversation.”
A 2005 graduate of Harvard Law School, Cyphert has spent more than five years researching the impact of AI on the practice of law in the United States. Much of her research has focused on “algorithmic amplification,” affording her unique insight into the relationship between AI and the media.
Cyphert is the author of several papers on AI, including “A Human Being Wrote This Law Review Article: GPT-3 and the Practice of Law” and “A Change is Gonna Come: Developing a Liability Framework for Social Media Algorithmic Amplification.” In the former – originally published in November 2021 – Cyphert outlines the implications and potential dangers of an AI capable of fooling readers into believing it is human. In the latter, she discusses the legal immunity presently afforded to social media companies.
“Once that immunity is altered, either by Congress or by the courts, these companies may be liable for the decisions and actions of their […] artificial intelligence models that sometimes amplify the worst in our society,” Cyphert wrote.
From using the GPS in our cars to interacting with a chatbot while ordering groceries, AI continues to ingrain itself into our society. However, the rise of “large language models” – such as ChatGPT – raises uncertainty about a journalist’s place in the newsrooms of the future.
“I don’t want journalists replaced by large language models; that would be a nightmare for a lot of reasons,” Cyphert said. “Do I think those large language models can help journalists with their work? Yes, and that’s something that newsrooms should be discussing. But [journalists should] never be replaced. You need that human in the loop because of all the problems with bias in the tools, because of the tools’ propensity to make up information – something we call ‘AI hallucinations.’”
Cyphert referenced a recent study – one she plans to discuss in detail during her appearance at the WV Press Association’s convention – that highlights how difficult it can be for readers to determine whether content was written by a human or by a large language model.
“Here’s the part that is really scary: humans are more likely to believe disinformation and misinformation when it is written by a large language model than when it is written by a human,” Cyphert noted. “We don’t know exactly why yet, but we know that was the finding of at least one study. That should give us all pause.”
According to Cyphert, the tools – while they have the potential to save both time and money – are only as effective as the humans using them.
“ChatGPT passed the bar exam,” Cyphert noted, before adding, “Lawyers have already gotten themselves into trouble by trying to use it to replace themselves.”
“There is so much human judgment and instinct required to practice law, and, I would argue, to be a journalist,” Cyphert said. “There are so many problems right now with these tools in terms of bias and mistakes, that it would not only be dangerous and irresponsible, but you would potentially be liable for choosing to have a tool that makes big mistakes and puts out the wrong stuff.”
“I use the word ‘tool’ intentionally,” Cyphert continued. “A tool doesn’t replace a carpenter – a tool can make a carpenter better, faster, more efficient – but you still need the carpenter. We’re not talking about automation, we’re talking about tools.”
“I would be remiss if I didn’t say that, if, as a reporter, a tool makes you faster and more efficient, then it is possible that instead of needing 10 (reporters), you need nine,” Cyphert added. “I think we all have to be clear-eyed and realistic about that, but I don’t think we’re anywhere near that yet.”
Cyphert’s seminar is scheduled for 10:30 a.m. on Saturday, Aug. 12. For more information regarding the 2023 WV Press Association Convention, including a full schedule of events and details on how to register, visit wvpress.org.