Inside Arlington is an online news site with no gaudy photos, no snappy headlines — and almost no humans.
The site uses artificial intelligence systems to cover school committee meetings, real estate transactions, and other municipal affairs in the Boston suburb of Arlington. It’s work once assigned to the lowliest of cub reporters. But the small-town papers that once handled such coverage are either long gone or too short-staffed to provide this kind of meat-and-potatoes journalism.
“The town of Arlington, for practical purposes, is a news desert,” said town resident and software developer Winston Chen. So, he teamed up with fellow Arlington resident and veteran foreign correspondent David Trilling to launch Nano Media, a nonprofit company that’s built a digital toolkit to automate local reporting. If the system proves its worth in Arlington, Chen and Trilling plan to offer their technology at low cost to news-starved communities throughout the US.
“We want the technology to be scalable enough for anybody to raise their hand and say, I want to start this kind of news outlet for my town,” said Chen. “We call this local news in a box.”
It’s an uneasy thought for human journalists who’ve already seen digital technology ravage the industry. About 2,500 US newspapers have closed since 2005, most of them in suburbs and small towns, according to the Medill School of Journalism at Northwestern University. The Pew Research Center estimates that 57 percent of newspaper journalists lost jobs between 2008 and 2020, mostly because newspapers lost billions in advertising dollars to Google and other online giants.
But it’s unclear whether computer-generated news stories will cost still more newspaper jobs. After all, the technology isn’t new.
Wire services Reuters and the Associated Press have cranked out computerized news for years, mainly for simple stuff like high school and college sporting events or corporate financial reports. The technology frees up human journalists to cover more complex and demanding stories.
These programs generally work by ingesting raw data, such as box scores from sports teams or earnings figures from corporate press releases. They don’t use well-known AI programs such as ChatGPT to generate the stories. Instead, these systems combine the data with text templates created ahead of time by humans to make the results more readable.
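A minimal sketch of that template approach might look like this. The team names, scores, and template wording below are purely illustrative, not any wire service's actual system:

```python
import random

# Hypothetical templates a human editor writes ahead of time.
# The {placeholders} get filled from structured game data.
TEMPLATES = [
    "{winner} defeated {loser} {w_score}-{l_score} on {day}.",
    "{winner} beat {loser} by a score of {w_score}-{l_score} on {day}.",
]

def write_recap(game: dict) -> str:
    """Turn a box score into a one-sentence recap by filling a template."""
    if game["home_score"] >= game["away_score"]:
        winner, loser = game["home"], game["away"]
        w_score, l_score = game["home_score"], game["away_score"]
    else:
        winner, loser = game["away"], game["home"]
        w_score, l_score = game["away_score"], game["home_score"]
    # Picking a template at random varies the phrasing; reusing one template
    # everywhere is how identical sentences end up in different papers.
    template = random.choice(TEMPLATES)
    return template.format(winner=winner, loser=loser,
                           w_score=w_score, l_score=l_score, day=game["day"])

# Example with made-up data:
game = {"home": "Arlington High", "away": "Medford High",
        "home_score": 21, "away_score": 14, "day": "Friday"}
print(write_recap(game))
```

No language model is involved: the output is only as varied as the human-written template pool, which is exactly the limitation the Lede AI episode below illustrates.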
It doesn’t always work. Lede AI, an Ohio company that offers AI tools for producing local sports stories, became a national laughingstock last month after client newspapers in different cities served up sports stories that repeatedly used the same turns of phrase, such as “a close encounter of the athletic kind.” The Gannett newspaper chain, where many of the odd articles appeared, said it has “paused” its use of the Lede AI system.
“As with any new technological advance, some glitches can occur,” said a statement issued by Lede AI. “We immediately launched an around-the-clock effort to correct the problems and made the appropriate changes.”
Inside Arlington is taking the technology a step further. It tracks down town news with software that regularly scans government websites and captures the latest reports. It also monitors the town’s YouTube channel to obtain video recordings of public meetings. Then it uses voice-to-text transcription software to generate a written record of each meeting. Suddenly there’s no need to send a reporter to the school board meeting, because computers can cover it.
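The scanning step could work roughly as follows. This is a simplified sketch, not Nano Media's actual code; the URL, file pattern, and helper names are invented for illustration:

```python
import re

def extract_pdf_links(html: str, base_url: str) -> list[str]:
    """Pull links to PDF reports out of a government page's HTML."""
    links = re.findall(r'href="([^"]+\.pdf)"', html)
    return [l if l.startswith("http") else base_url + l for l in links]

def new_documents(links: list[str], already_seen: set[str]) -> list[str]:
    """Keep only documents the scanner has not captured before."""
    fresh = [l for l in links if l not in already_seen]
    already_seen.update(fresh)
    return fresh

# Simulated page content; a real scanner would fetch this on a schedule.
html = '<a href="/minutes/2023-09-school-committee.pdf">Minutes</a>'
seen: set[str] = set()
docs = new_documents(extract_pdf_links(html, "https://town.example"), seen)
print(docs)  # first pass surfaces the newly posted minutes
```

Run on a schedule, the second pass returns nothing for an unchanged page, so only newly posted documents flow downstream to transcription and summarization.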
“I think a lot of this would be impossible without COVID,” Trilling said, because it was the pandemic that drove many cities to host public meetings online, where computers can listen in.
Having captured the texts, Inside Arlington relies on ChatGPT software to write the summaries. The humans double-check the results to weed out obvious errors. “All the editor has to do is go in, read it, make sure there isn’t something embarrassing and push ‘publish,’” Trilling said.
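The workflow Trilling describes, a machine-written draft gated behind human sign-off, can be sketched in a few lines. The prompt wording and function names here are assumptions for illustration, not Inside Arlington's actual pipeline:

```python
def build_summary_prompt(meeting_title: str, transcript: str) -> str:
    """Assemble the instruction sent to a language model (wording illustrative)."""
    return (
        f"Summarize the following transcript of the {meeting_title} "
        "as a short news brief. Report only what was said; "
        "do not speculate or add information.\n\n" + transcript
    )

def ready_to_publish(draft: str, editor_approved: bool) -> bool:
    """The human gate: nothing goes live until an editor signs off."""
    return bool(draft.strip()) and editor_approved

# Example with made-up content:
transcript = "The board voted 5-0 to approve the new bus schedule."
prompt = build_summary_prompt("School Committee meeting", transcript)
draft = "(model-generated summary would go here)"
print(ready_to_publish(draft, editor_approved=True))  # prints True
```

The instruction to "report only what was said" is an attempt to constrain the model, but as the critics below note, it is the human check that actually catches fabrications.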
But are the resulting stories truly journalism? Former journalism professor Jeff Jarvis has his doubts. Jarvis, a cofounder of Entertainment Weekly magazine and author of “The Gutenberg Parenthesis,” warns that AI software frequently makes up stories with no connection to reality. For example, the technology news site CNET earlier this year had to post corrections to dozens of AI-generated articles that contained blatant factual errors.
AI systems have no conception of truth, warned Jarvis. “It’s irresponsible to tie that to an activity where there’s an expectation of fact and truth.”
Ethan Zuckerman, director of the Initiative for Digital Public Infrastructure at the University of Massachusetts at Amherst, warns of an even more difficult problem. A competent human journalist understands the topic at hand, but AI programs don’t. They just use statistics to string words together, creating stories devoid of context or human insight.
“Trying to get meaningful journalism out of this is one of the most difficult challenges we can imagine,” said Zuckerman.
For instance, if a city official said something shocking or controversial during a hearing, a human reporter would highlight it in the story. AI wouldn’t notice; it would just write up a bland summary of the meeting that would read like a “pile of unseasoned broccoli,” Zuckerman said.
Sure enough, Inside Arlington’s stories are little more than a compressed litany of facts, unseasoned by insightful analysis, historical context, or a trace of wit. In short, the site makes for deadly dull reading.
Still, it provides Arlington residents with useful information about town government — information that’s otherwise quite hard to come by.
Chen and Trilling say they’re aware of AI’s limitations. Serious journalism requires in-depth research and human interactions that are beyond any computer’s abilities, they say. “At the end of the day, we’re still going to need full-time journalists,” said Chen.
But perhaps not quite so many of them.