🧪 Why game publishers are pushing back hard against generative AI.
🗞️ AI's boiling frog effect, hiring data, and Gemini lands on Mac.
📊 Monday Poll on who actually controls AI in gaming.
💡 Roko's Pro Tip on why sales data beats the discourse.
Let’s dive in. No floaties needed…

When markets get shaky, advisors don’t just manage portfolios. They manage fear, questions, follow-up and a flood of client communication.
That’s where weak delegation gets expensive.
If meeting prep, paperwork, CRM updates and account admin still run through you, response times slip and the client experience takes the hit.
BELAY created the free Financial Advisor’s Delegation Guide to help you identify what to hand off, what to keep and how to stay client-facing without losing control.
Inside, you’ll learn how to reduce bottlenecks, protect responsiveness and free up more time for the work only you should be doing.
*This is sponsored content

Creative backlash is surging: 52% of game workers now say generative AI is harming their industry, up from 18% two years ago, with opposition strongest among artists, designers, and writers.
Steam is drowning in AI slop: Nearly 8,000 titles disclosed AI use in H1 2025 alone, an eightfold jump from all of 2024, making discoverability a real commercial problem for human-made games.
Placeholder AI keeps shipping by accident: Crimson Desert, The Alters, and others launched with AI assets meant to be replaced, prompting Hooded Horse to ban AI content outright in publishing contracts.
Player wallets may override outrage: Crimson Desert still sold 2M copies on day one despite the controversy, suggesting purchasing behavior will decide AI adoption more than policy.
The video game industry has long been a space where developers and creatives come together to build immersive worlds shaped by storytelling and design. But its significance now extends far beyond creative expression. It has grown into a commercial powerhouse, with the global market estimated at around $188.9B in 2025, making it one of the largest entertainment industries in the world.
That scale, combined with generative AI’s ability to accelerate both software development and creative production, has made gaming a natural testing ground for the technology.
Yet, like many industries being reshaped by generative AI, gaming has responded with resistance, one rooted in a lack of clarity about who controls creative labor and what role human judgment plays in a medium that has historically depended on it.
Recently, the tension led Mike Rose, founder of indie publisher No More Robots, to declare that, from a publisher’s perspective, generative AI is “mega annoying” and that “video games are cooked,” pointing to the flood of AI-generated content overwhelming storefronts like Steam.
Rose’s frustration encapsulates the forces at play within an industry simultaneously facing challenges from technology, economics, and cultural shifts.
His reaction to the use of generative AI, though, represents only one side of the industry story.
Even before artificial intelligence became a driving force across the technology industry, it was used in video game development for tasks like pathfinding, enemy behavior, and procedural terrain generation, without controversy.
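That older, uncontroversial breed of game AI is deterministic search rather than generation. As a point of contrast, here is a minimal A* pathfinding sketch on a grid, purely illustrative and not tied to any particular engine:

```python
import heapq

def astar(grid, start, goal):
    """A* pathfinding on a 2D grid (0 = walkable, 1 = wall).

    Returns the list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        # Manhattan distance: admissible for 4-directional movement
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), start)]          # (f-score, cell)
    came_from = {start: None}
    g_score = {start: 0}
    done = set()
    while open_heap:
        _, cur = heapq.heappop(open_heap)
        if cur == goal:
            # Walk parent links back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        if cur in done:
            continue
        done.add(cur)
        r, c = cur
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_score[cur] + 1
                if ng < g_score.get(nb, float("inf")):
                    g_score[nb] = ng
                    came_from[nb] = cur
                    heapq.heappush(open_heap, (ng + h(nb), nb))
    return None
```

Systems like this have shipped in games for decades: fully authored, fully predictable, and raising none of the authorship questions that generative asset tools do.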
What has changed is the nature of the tools and, more specifically, where their outputs end up in the finished product. The current backlash is aimed less at the underlying technology that builds the worlds players inhabit and more at AI that produces player-facing content: the art hanging on a virtual wall, the voice of a non-player character, or the texture of a landscape that a studio’s artists spent months designing by hand.
This undercurrent is measurable and reflected in the 2026 State of the Game Industry report’s findings. According to the report, 52% of game industry workers now believe generative AI is harming their industry, up from 30% just one year earlier and 18% two years earlier.
Workers in visual art, game design, and narrative hold the most unfavorable views, with opposition above 60% in each discipline, and only 7% of all respondents said generative AI was having a positive effect, down from 13% the prior year. The people who are most skeptical of the technology are, overwhelmingly, the ones closest to the creative work it threatens to automate.
Beyond the tensions in the creative field, generative AI is also changing how games are marketed, discovered by players, and distributed.
Rose’s complaint about discoverability speaks to a concrete commercial threat that publishers are already feeling. During the last Steam Next Fest, he observed that roughly one-third of the demos appeared to feature AI-generated key art or in-game content. This makes it materially harder for publishers investing in human-made games to attract attention.
Nearly 8,000 titles released on Steam in the first half of 2025 disclosed the use of generative AI, an eightfold increase from roughly 1,000 disclosures across all of 2024, according to VGC. Since disclosure is voluntary, the actual number is almost certainly higher.
Valve has tried to bring structure to the situation by rewriting its AI disclosure form in January 2026 to distinguish between tools used during development and content that reaches the player. But enforcement remains based on good faith, and the Epic Games Store has no disclosure requirements at all, with CEO Tim Sweeney arguing that such labels are becoming meaningless because AI will soon touch nearly every part of game production.
Put plainly, generative AI’s ability to fast-track game development is flooding the market, and human-made games are losing players to sheer volume, much as human-created content on social media struggles to find its audience amid an onslaught of AI-generated slop.
Beyond discoverability, publishers face a more immediate problem: AI assets that were never supposed to reach players are making their way into shipped games with increasing regularity, and the resulting apology cycle has become a recurring feature of industry news.
In March 2026, the open-world RPG Crimson Desert sold 2M copies on its launch day before players began identifying AI-generated paintings in its environments. Developer Pearl Abyss issued an apology, explaining that some visual props had been created with “experimental AI generative tools” during early development and were supposed to be replaced before launch.
The game’s Steam page was also updated to include an AI disclosure only after the controversy emerged. The same pattern played out with 11 Bit Studios’ The Alters, Ubisoft’s Anno 117: Pax Romana, and Sandfall Interactive’s Clair Obscur: Expedition 33, which was stripped of its Game of the Year award from the Indie Game Awards after placeholder AI textures were discovered in the shipped build.
Tim Bender, CEO of indie publisher Hooded Horse, has responded by adding a blanket ban on AI-generated assets to his company’s publishing contracts, calling generative AI “cancerous” and advising developers not to use it at any stage of production due to the risk that placeholders will slip into the final product.
However, enforcement remains a challenge, as there are no reliable automated tools for detecting generative AI content in a finished game, leaving the ban to depend entirely on trust and manual vigilance across the entire production chain.
And even as the industry continues to grapple with these issues, the rapid pace of AI development, backed by big-ticket investors, paints a bleak picture.
The recently released NVIDIA DLSS 5 changes a game after it’s already been released, at the rendering layer, without direct developer involvement.
Earlier versions of DLSS were widely accepted because they improved performance while preserving the developer’s original visual intent. DLSS 5, unveiled at GTC 2026, uses what NVIDIA calls 3D-Guided Neural Rendering, which generates new lighting and material effects in real time.
In other words, even if developers refrain from using AI during development, the technology can still make visible changes to a game after it reaches the public.
A live example of this was seen during demos of Resident Evil Requiem, where players noticed visible changes to the protagonist’s face, with smoother features that some compared to AI beauty filters.
The backlash was immediate, with criticism from both players and developers, some of whom said they hadn’t been consulted. NVIDIA CEO Jensen Huang dismissed the concerns, but the debate goes beyond any single feature. It raises a larger question: should a hardware company be able to alter the look of a game after its creators are done with it?
The executive-developer divide on this issue is widening, with structural consequences for the entire gaming industry. A GDC survey showed that only about 30% of game developers use AI tools, compared to 58% in publishing and marketing. At the same time, executives tend to be far more optimistic about the technology than the people actually making games.
At Take-Two Interactive, CEO Strauss Zelnick said the company was running hundreds of AI experiments, even as GTA 6 was built without it, and soon after, the company cut parts of its AI team. Meanwhile, others are taking a more cautious approach, avoiding AI-generated assets in games while using AI internally to improve workflows.
All this is happening while job insecurity is shaping how developers see AI. Around 28% of GDC respondents said they’ve been laid off in the past two years, and a large majority of U.S. developers now support unionization. Legal uncertainty adds another layer, with dozens of copyright lawsuits against AI companies still unresolved, making AI-generated content a potential risk for studios.
The video game industry, then, is facing challenges on all fronts: some driven by a shifting technological landscape, others by uncertainty over how AI will reshape legal exposure and workflows. Yet one signal may matter more than any policy or lawsuit in deciding the path from here: player behavior. Despite the controversy over its use of AI, Crimson Desert still sold 2M copies on day one, suggesting that what players say about AI and what they actually buy may not align. That gap could determine how quickly studios adopt the technology.


💡 What players buy matters more than what they say. Watch the sales data, not the discourse.

Want to know what world-class talent actually costs in 2026?
Athyna's Salary Report breaks down real salary data across AI, Tech, Data, Design, and more—so you can see exactly where the savings are.
The numbers might surprise you.
*This is sponsored content

The boiling frog problem: A study found that just 10 minutes of AI-assisted problem-solving makes people worse at independent thinking once the AI is removed and less willing to try.
AI hasn't killed hiring (yet): LinkedIn's data shows hiring is down 20% since 2022, though it blames interest rates, not AI. The caveat: job skills are expected to change 70% by 2030.
Gemini lands on Mac: Google launched a native Gemini Mac app, the last of the big three to arrive on desktop, with screen sharing, keyboard shortcuts, and a free tier.

🗳️ AI in gaming: who actually decides?

Regie: AI-powered sales copilot to draft, personalize, and refine outbound messaging.
Replit: In-browser AI coding environment for writing, debugging, and running apps fast.
Sourcegraph: A code intelligence layer that lets AI search, understand, and refactor huge codebases.

What did you think of today's email?
