
AI Companies Staff Up PR Teams as Image Crisis Deepens

February 20, 2026



OpenAI, Google, Anthropic, and Perplexity are recruiting senior PR professionals at unprecedented rates. OpenAI alone has tripled its communications team since January, and the industry as a whole is seeing a sharp uptick in AI reputation-management roles.

The catalyst? Public trust in AI companies dropped in the past six months, driven by copyright lawsuits, unauthorized content scraping, and publisher revenue concerns. Perplexity faced particular scrutiny after publishers discovered the company was ignoring robots.txt files and scraping content without permission.

Translation: These companies know they have a credibility crisis, and they're spending serious money to fix it.

Publishers See Traffic Drop from AI Answers

The reputation battle matters because AI companies are reshaping how users consume content. Google's AI Overviews now appear in 84% of search results, while Perplexity handles over 100 million queries monthly—often providing answers without sending traffic to original sources.

Publishers with 5 million monthly visitors report losing an average of 1.15 million visits to AI-powered search features. That translates to roughly $23,000 in monthly ad revenue for sites earning a $20 RPM (revenue per 1,000 pageviews). The math gets brutal quickly: a 23% traffic decline compounds month over month as users adapt to getting answers directly from AI tools.
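Here's that math in a minimal sketch. The visit count, RPM, and pages-per-visit figures are the illustrative numbers from the example above, not Playwire benchmarks:

```python
# Back-of-the-envelope estimate of ad revenue lost to AI answers.
# All input figures are illustrative assumptions from the example above.

def monthly_revenue_loss(lost_visits: int, rpm: float, pages_per_visit: float = 1.0) -> float:
    """Estimate monthly ad revenue lost, where rpm is revenue per 1,000 pageviews."""
    lost_pageviews = lost_visits * pages_per_visit
    return lost_pageviews / 1000 * rpm

# 5M monthly visitors losing 1.15M visits (a 23% decline) at a $20 RPM
loss = monthly_revenue_loss(lost_visits=1_150_000, rpm=20.0)
print(f"Estimated monthly loss: ${loss:,.0f}")  # roughly $23,000
```

At one pageview per visit the sketch reproduces the $23,000 figure; multi-page sessions push the loss higher.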

Here's what matters: AI companies are staffing up PR teams because they need publisher cooperation for training data and legitimacy. But they're simultaneously building products that reduce publisher traffic.

Robots.txt Updates Won't Wait for Better PR

While AI companies polish their messaging, publishers need immediate action. Most major AI crawlers, including OpenAI's GPTBot, Anthropic's ClaudeBot, and Google-Extended (the token that controls Google's AI training access), can be blocked through robots.txt updates, but implementation varies significantly from site to site.
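For publishers that decide to block, the directives themselves are short. The sketch below generates a robots.txt stanza for a set of commonly cited AI user-agent tokens; the agent list is an assumption to adjust for your own stack, and the output should be reviewed before it touches a live robots.txt file:

```python
# Sketch: emit a robots.txt stanza that disallows a set of AI crawler user agents.
# The agent list is illustrative; confirm the tokens each vendor currently documents.
AI_USER_AGENTS = ["GPTBot", "ClaudeBot", "Google-Extended", "PerplexityBot"]

def build_block_stanza(agents: list[str]) -> str:
    lines: list[str] = []
    for agent in agents:
        lines.append(f"User-agent: {agent}")
        lines.append("Disallow: /")
        lines.append("")  # blank line separates rule groups
    return "\n".join(lines)

print(build_block_stanza(AI_USER_AGENTS))
```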

Publishers should audit their current crawler protection within the next 30 days. OpenAI continues to announce new publisher partnerships, which means existing scraping arrangements could change at any time. Companies with weak crawler protection risk losing negotiating leverage as AI firms seek "legitimate" content sources.
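As a starting point for that audit, Python's standard-library robots.txt parser can report which of those agents your current rules still allow. A minimal sketch, assuming the same agent list as above and a placeholder site URL:

```python
# Audit sketch: check which AI user agents the current robots.txt still allows.
# Uses only the standard library; the site URL and agent list are placeholders.
from urllib.robotparser import RobotFileParser

AI_USER_AGENTS = ["GPTBot", "ClaudeBot", "Google-Extended", "PerplexityBot"]

def audit_ai_crawlers(site: str, test_path: str = "/") -> dict[str, bool]:
    """Return {agent: True} if the agent is still allowed to fetch test_path."""
    rp = RobotFileParser()
    rp.set_url(f"{site.rstrip('/')}/robots.txt")
    rp.read()  # fetch and parse the live robots.txt
    return {agent: rp.can_fetch(agent, f"{site.rstrip('/')}{test_path}")
            for agent in AI_USER_AGENTS}

if __name__ == "__main__":
    for agent, allowed in audit_ai_crawlers("https://example.com").items():
        print(f"{agent}: {'still allowed' if allowed else 'blocked'}")
```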

The catch: Blocking crawlers might reduce future AI traffic referrals, but allowing unrestricted access guarantees continued content scraping without compensation. Publishers need clear blocking strategies before these companies' improved PR convinces users that unauthorized scraping is acceptable.

AI Companies Will Keep Scraping Despite Better Messaging

Expect AI companies to announce more publisher partnerships and revenue-sharing deals as their PR campaigns intensify. But fundamental business models won't change—these firms need vast content libraries to train increasingly sophisticated models.

The real test isn't whether AI companies hire better communicators, but whether they'll pay fair licensing rates for publisher content. Early signs suggest they prefer PR fixes over revenue sharing.

Publishers can evaluate their current AI crawler protection and identify revenue risks with Playwire's AI Crawler Protection.


Editorial Disclosure

This article was produced with AI assistance and reviewed by the Playwire editorial team. News sources are cited where applicable. Playwire is committed to providing accurate, timely information to help publishers navigate the digital media business. For questions about our editorial process or to suggest topics for future coverage, contact our team.