Friday, March 20, 2026

Major Sites Say No to Apple’s AI


In a separate analysis this week, data journalist Ben Welsh found that just over a quarter of the news sites he studied (294 of 1,167 mostly English-language publications based in the US) block Applebot-Extended. By comparison, Welsh found that 53 percent of the news sites in his sample block the OpenAI bot. Google introduced its own AI-specific bot, Google-Extended, last September; it’s blocked by nearly 43 percent of those sites, a sign that Applebot-Extended may still be flying under the radar. But as Welsh tells WIRED, that number has been “gradually increasing” since he started looking.

Welsh has an ongoing project monitoring how news organizations approach major AI agents. “There’s been some division among news publishers about whether they want to block these bots,” he says. “I don’t have an answer as to why every news organization has made that decision. Of course, we read about a lot of them entering into licensing agreements where they get paid to let bots in — maybe that’s a factor.”
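The kind of survey Welsh runs boils down to fetching each site's robots.txt file and checking which AI crawlers it disallows. A minimal sketch of that check, using only Python's standard-library robots.txt parser (the crawler names are real; the sample rules and URL are placeholders, not any publisher's actual file):

```python
# Sketch: given the text of a robots.txt file, report which AI
# crawlers are disallowed from fetching a page on that site.
from urllib.robotparser import RobotFileParser

AI_BOTS = ["Applebot-Extended", "GPTBot", "Google-Extended"]

def blocked_bots(robots_txt: str, url: str = "https://example.com/") -> list[str]:
    """Return the AI crawlers that this robots.txt disallows for `url`."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())  # parse the rules directly
    return [bot for bot in AI_BOTS if not parser.can_fetch(bot, url)]

# A hypothetical publisher's robots.txt that blocks two of the three bots:
sample = """\
User-agent: Applebot-Extended
Disallow: /

User-agent: GPTBot
Disallow: /
"""
print(blocked_bots(sample))  # prints ['Applebot-Extended', 'GPTBot']
```

In a real survey, each site's file would be downloaded (for example with `RobotFileParser.read()`) and the tally repeated across the whole sample of publications.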

Last year, the New York Times reported that Apple has tried to strike AI deals with publishers. Since then, competitors like OpenAI and Perplexity have announced partnerships with news outlets, social media platforms, and other popular websites. “A lot of the world’s largest publishers are clearly taking a strategic approach,” says Originality AI founder Jon Gillham. “I think in some cases it’s a business strategy — like holding back data until they can strike a partnership.”

There’s some evidence to support Gillham’s theory. For example, Condé Nast’s sites previously blocked OpenAI’s web crawlers; after the company announced its partnership with OpenAI last week, it unblocked the company’s bots. (Condé Nast declined to comment on the record for this story.) Meanwhile, BuzzFeed spokesperson Juliana Clifton told WIRED that the company, which is currently blocking Applebot-Extended, puts any AI web crawler it identifies on its blocklist unless its owner has entered into a partnership — usually a paid one — with the company, which also owns the Huffington Post.

Because the robots.txt file must be edited manually, and so many new AI agents are debuting, it can be difficult to keep a blocklist current. “People just don’t know what to block,” says Dark Visitors founder Gavin King. Dark Visitors offers a freemium service that automatically updates a client’s site’s robots.txt file, and King says publishers make up a significant portion of his client base because of copyright concerns.
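A blocklist of this kind is nothing more than a series of user-agent rules in the site's robots.txt file. A minimal example that disallows the three AI crawlers named in this story while leaving the rest of the site open might look like:

```
# Block AI training crawlers site-wide; allow everything else.
User-agent: Applebot-Extended
Disallow: /

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: *
Allow: /
```

Keeping a file like this current is the manual chore King describes: every newly announced crawler means another user-agent entry added by hand.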

Robots.txt may seem like arcane territory for webmasters — but given its enormous importance to digital publishers in the AI era, it’s now the domain of media executives. WIRED has learned that two CEOs of major media companies are directly deciding which bots to block.

Some media outlets have made it clear that they block AI scrapers because they don’t currently have partnerships with their owners. “We block Applebot-Extended across all Vox Media properties, as we have done with many other AI scrapers when we don’t have a commercial agreement with the other party,” says Lauren Starke, senior vice president of communications at Vox Media. “We believe in protecting the value of our published work.”
