Developers who were on the fence about the impact of generative AI on game quality are growing pessimistic.
At a Glance
- New data suggests developers are now about four times more likely to say generative AI will reduce the quality of games than to say it will improve them.
- The shift in opinion occurred among developers who were 'neutral' on the subject.
- Industry experts agree pressure to use these tools—despite their flaws—is driving this shift.
Concerns over the negative impact of generative AI are spreading among game developers. According to new data from the Game Developer Collective, developers are now about four times more likely to say generative AI technology will negatively impact the quality of games than to say it will improve them. It's a rising concern from a community already wary of the technology due to its intense energy usage and lingering questions over the ethics and legality of training data.
The statistics—gathered by our peers at Omdia by surveying members of the Game Developer Collective panel—show a 13 percentage point increase in the number of developers concerned about product quality, rising from 34 percent in 2024 to 47 percent in 2025. Fewer developers expressed positivity about the technology, dipping from an already-scant 17 percent in 2024 to 11 percent in 2025.
The number of respondents who said they were "neutral" on the topic or thought it would have no impact on quality also decreased.
We reached out to experts across the game industry to quiz them on the shift, casting a wide net to look at the topic from different backgrounds. Necrosoft Games founder and Demonschool game director Brandon Sheffield argued the technology is part of a growing trend of companies "sanding off the edges" of games as an art form. "I'm not just talking about high-minded ideas…I'm talking about something as mechanics-oriented as 'what's a good jump' or 'what does a good assault rifle feel like?'" Sheffield's primary concern is AI's tendency to replicate content from its training data.
"Games using [genAI] will be more generic, even if that's not immediately discernible to a player."
Hidden Door CEO and founder Hilary Mason—herself a developer experienced with machine learning and genAI—expressed a similar sentiment. "Large Language Models (LLMs) are aspirationally mid," she said, adding that the economic pressure of the games business could incentivize some studios to crank out "mediocre or even slop games" without "care and polish."
King's College senior lecturer Mike Cook suggested that the shift in perception may not be about the assessment of the tools, but a judgment on the companies using them. "There was a perception that companies didn't really want to use it, and that the tools weren't reliable."
"What we've seen over the past twelve months…is that a lot of companies are using these tools despite their problems and despite the public backlash," he continued. "I suspect a lot more developers are facing situations where they're being asked to make do with AI outputs as 'good enough,' especially when the industry continues to get squeezed and more people lose their jobs."
Can AI toolmakers win over game developers?
If AI toolmakers want to get back in developers' good graces, they might want to lean into the technology's flaws instead of handwaving them away.
The Games Fund founder and managing partner Ilia Eremeev (who began his career as a 3D artist, informing his team's decision to invest in Layer.ai) agreed that exposing players to AI-generated content runs the risk of driving down perceptions of quality. But he said this trend also shows generative AI is being used in the wrong fields. "There is a lot of unnecessary labor in asset variations, retopology, rigging, localization, QA, code assistance, even design research," he said, arguing that AI can "dramatically speed things up."
He's sour on the idea of a "text-to-game" pipeline like the kind Electronic Arts showed off in its 2024 presentation to investors. "Describing complex systems in natural language is often harder than building them. You either end up with very abstract results or waste more time than you save."
Mason's betting that generative AI has a purpose in making "new kinds" of games—though she makes a more modest pitch for Hidden Door's interactive roleplaying platform, which generates tabletop-like roleplaying prompts that echo an author's prose. "It's an old game style, now possible in a new form," she said. She and Eremeev both suggested that discoverability tools on game platforms will play a large role in shaping players' views on AI in games.
Companies that want their workers to eagerly adopt AI tools might benefit from not "forcing" those tools on them. Cook and Sheffield both echoed this idea, with the King's College lecturer pointing out that the game industry "doesn't have a good track record of making decisions based on what's best for a game, or listening to feedback from their employees."
"Forcing things on people has never really yielded amazing results," Sheffield quipped. "But this seems to be the only idea [companies] like Microsoft have."
Game Developer and Omdia are sibling organizations under Informa.
Update 9/16: This story has been updated with the correct spelling of Ilia Eremeev's first name.