The New York Times Drops Freelancer After AI-Generated Book Review Copies Guardian
Author Alex Preston admits to using an AI tool that 'dropped in' text from a previously published Guardian critique

The New York Times has terminated its relationship with freelance journalist Alex Preston following an AI-driven plagiarism incident.
An internal investigation confirmed that Preston used an artificial intelligence tool to draft a review of the novel Watching Over Her by Jean-Baptiste Andrea. The resulting text contained phrases and thematic observations nearly identical to those in a review by Christobel Kent, published by The Guardian in August.
A reader tip-off prompted The Times to issue a formal editor's note on 30 March 2026. The publisher cited reliance on AI and the use of unattributed work as clear violations of editorial standards. Preston has since apologised to both publications and the original reviewer.
Reader Alerted NYT
British author and journalist Alex Preston, who has written for publications including The Observer and its New Review, the Financial Times, The Economist, and Harper's Bazaar, expressed regret to The New York Times. 'I am hugely embarrassed by what happened and truly sorry,' he told The Guardian. He also apologised to the original reviewer, Christobel Kent, and to The Guardian.
The issue came to light when a reader noticed the similarities and alerted The New York Times, which opened an immediate investigation. That investigation confirmed that Preston had used an AI tool while drafting the review.
In an editor's note appended to Preston's published review dated 30 March 2026, The Times explained, 'A reader recently alerted The Times that this review included language and details similar to those in a review of the same book published in The Guardian. We spoke to the author of this piece, a freelancer reviewer, who told us he used an A.I. tool that incorporated material from the Guardian review into his draft, which he failed to identify and remove. His reliance on A.I. and his use of unattributed work by another writer are a clear violation of The Times's standards.'
Preston told The Guardian he was 'hugely embarrassed' by the incident, admitting he had failed to spot the segments the tool had pulled from The Guardian's review. 'I made a serious mistake in using an AI tool on a draft review I had written,' he said, 'and I failed to identify and remove overlapping language from another review that the AI dropped in.'
Similarities In Published Text: Specific And Structural
SCOOP: The New York Times cut ties with a freelance book review author after it found out he used AI to help draft a review...that pulled from a Guardian review published months prior.https://t.co/phmjU3Iuwk
— Corbin Bolies (@CorbinBolies) March 30, 2026
The passage in question included character descriptions and observations on the novel's themes that were almost identical to those in Kent's review. For instance, Kent described a key character as 'lazy Machiavellian Stefano,' whereas Preston wrote it as 'lazy, Machiavellian Stefano.'
| Category | Christobel Kent (The Guardian) | Alex Preston (The NYT) |
| --- | --- | --- |
| Character description | lazy Machiavellian Stefano | lazy, Machiavellian Stefano |
| National imagery | country of contradictions, battered, war-torn | country of contradictions: battered, divided |
| Setting description | where circuses spring up on wasteland | where circuses rise on wasteland |
| Thematic tone | miraculous: an Italy where life is costume | miraculous. This is an Italy where life is performance. |
The similarities in the published text were specific and structural, with direct overlap in both the character descriptions and the final evaluation of the novel's themes. The AI tool appeared to lift distinctive adjectives and complex metaphors without modification.
The NYT confirmed it has severed ties with the journalist, emphasising he would no longer contribute to the newspaper. Preston, who recently published his sixth book, A Stranger in Corfu, clarified that he had not used AI in any of his other work for the publication.
This case serves as a warning about the risks of using generative technology in professional journalism. It highlights the potential for AI software to scrape existing copyrighted material and present it as original prose.
© Copyright IBTimes 2025. All rights reserved.
