Newspaper chain Gannett has paused the use of an artificial intelligence tool to write high school sports dispatches after the technology made several major flubs in articles in at least one of its papers.

Several high school sports reports written by an AI service called LedeAI and published by the Columbus Dispatch earlier this month went viral on social media this week, and not in a good way. In one notable example, preserved by the Internet Archive’s Wayback Machine, the story began: “The Worthington Christian [[WINNING_TEAM_MASCOT]] defeated the Westerville North [[LOSING_TEAM_MASCOT]] 2-1 in an Ohio boys soccer game on Saturday.” The page has since been updated.

The reports were mocked on social media for being repetitive, lacking key details, using odd language and generally sounding like they’d been written by a computer with no actual knowledge of sports.

CNN identified several other local Gannett outlets, including the Louisville Courier Journal, AZ Central, Florida Today and the Milwaukee Journal Sentinel, that have all published similar stories written by LedeAI in recent weeks. Many of the reports feature identical language, describing “high school football action,” noting when one team “took victory away from” another and describing “cruise-control” wins. In many cases, the stories also repeated the date of the games being covered multiple times in just a few paragraphs.

Gannett has paused its experiment with LedeAI in all of its local markets that had been using the service, according to the company. The pause was earlier reported by Axios.

“In addition to adding hundreds of reporting jobs across the country, we are experimenting with automation and AI to build tools for our journalists and add content for our readers,” a Gannett spokesperson said in a statement.
“We are continually evaluating vendors as we refine processes to ensure all the news and information we provide meets the highest journalistic standards.”

LedeAI CEO Jay Allred expressed regret that articles produced for Gannett newspapers “included some errors, unwanted repetition and/or awkward phrasing,” adding that the company “immediately launched an around-the-clock effort to correct the problems and made the appropriate changes.”

“There were legitimate problems with the reports we produced and the feedback we received was valid,” Allred said in a statement to CNN. But, he added: “We believe content automation is part of the future of local newsrooms … Our service provides readers and communities with information they would not otherwise have, and frees reporters and editors to do real journalism that drives impact in the communities they serve.”

As of Wednesday, several Dispatch sports stories written by the service had been updated and appended with the note: “This AI-generated story has been updated to correct errors in coding, programming or style.”

The AI tool debacle comes after Gannett axed hundreds of jobs in December when it laid off 6% of its news division. It also comes as many news outlets grapple with how to handle the rapid advancement of AI technology.

CNET earlier this year also paused an experiment using AI to write stories after it was forced to issue multiple corrections on AI-generated reports. Meanwhile, some other outlets have blocked access to software from OpenAI, the maker of ChatGPT, in an effort to prevent their content from being used to train its AI models.