
AI's biggest triumphs are not AI | This Week in Business

Among Us VR and Candy Crush Saga are billed as success stories for the latest trend, but they don't necessarily belong under the AI umbrella


This Week in Business is our weekly recap column, a collection of stats and quotes from recent stories presented with a dash of opinion (sometimes more than a dash) and intended to shed light on various trends. Check every Friday for a new entry.

When I was in grade school, one of my teachers had a poster on the wall with a quote on it from an author.

Sadly, the name of the writer and the exact wording have faded from my memory, but the sentiment itself is still there, its impression only getting deeper and more clear with each passing year.

QUOTE | "If it makes you cry to read my work, know that I first had to cry to write it."

That quote shaped the way I think about reading and writing. It's more than a way to communicate information; it's a way to take the hodge-podge of thoughts, ideas, and feelings in one person and give them to another. It's a way to understand others, and be understood in turn.

I may never know what it's like to walk a mile in someone else's shoes, to experience their dreams, or to suffer their heartbreaks, but through their writing, I can begin to imagine it, and relate to the panoply of human experiences outside my own. And through my writing, I can begin to make myself understood to them. Not the way I am most commonly understood today – as a collection of data points and demographic categories that can be tracked and cross-referenced to determine which ads would be most likely to push me into a purchase – but as an actual individual with my own unique qualities and faults, my own particular sources for the emotions everyone feels to some degree.

My writing is a way for people to know me better, to create some form of shared understanding in a world that is far too often lonely and alienating. It is the sum of all that I am and have been over more than four decades now. It is me submitting to the mortifying ideal of being known.

There is a person behind these words, and you can learn something about me from them, from the literal meaning of the words, the way they are strung together, the way they cry out for an editor to pinch the bridge of their nose and grumble, "4,000 words? Again? Really?"

Even if I were to use these words insincerely, to lie, or to mistakenly say something that wasn't true, that too would tell you about the person I am. Because there is a person writing these words, and there is a fundamental truth in the process of putting them together that I cannot evade. There is something – someone – behind these words for the reader to engage with.

And this is why I am utterly uninterested in generative AI, at least when it comes to audience-facing elements of creative work. How am I supposed to engage with the meaning of a text if the only meaning behind it is a statistical likelihood that these words in this order will resemble something an actual person would say? How do I sift through a story for meaning and message if there was no consciousness behind it to put meaning there in the first place?

Generative AI can certainly put words in the mouth of an NPC. Unlimited words, perhaps words that will never repeat in the same way twice. And for the moment, let's not even worry about the non-trivial design questions of integrating such a thing into a game and just say it will basically work as advertised.

Those words are either plagiarized, or they are not actually coming from a person. There is nobody behind those words for me to connect with. They are not capturing anybody's fundamental truth for me to understand. Those words are hollow, representing nothing more than a company asking me to empathize with a remixed and diluted echo of actual humanity.

How am I expected to feel anything about creative output if no feeling ever went into creating it?

Why would I ever spend my precious and finite time reading words that literally nobody cared enough to write?

Already implemented AI

All of that is not to say the current AI trend will never take off.

After all, plenty of people don't regard reading or writing as especially worthwhile pursuits even as it stands.

QUOTE | "Most books can be much shorter. Definitely a useful task for [large language models] to summarize the salient points of a book." – In a post on his glorified Nazi bar/website last week, AI advocate and investor Elon Musk unintentionally sheds light on why he never met a classic sci-fi dystopia he didn't try to usher into reality.

I'm actually quite confident this current AI mania will produce its share of success stories, partly because the term "AI" has been stretched beyond all meaning.

QUOTE | "[AI is] a marketing term. Right now it's a hot one. And at the FTC, one thing we know about hot marketing terms is that some advertisers won’t be able to stop themselves from overusing and abusing them." – The Federal Trade Commission, warning people earlier this year that it is keeping an eye on how people sell their "AI" products.

A lot of the great new innovations we hear touted as AI are actually not that new or innovative

As a result, a lot of the great new innovations we hear touted as AI are actually not that new or innovative, something that was made abundantly clear to me during the Game Developers Conference last month.

Having long since grown tired of the AI hype train, I wasn't terribly interested in spending more time with people talking about what AI would supposedly be able to do next. (If that's what you had hoped to read, I would recommend Nathan Grayson's thorough recap of the not-ready-for-primetime AI NPC tech at the show for Aftermath.)

Instead, I wanted to see the victory laps. I wanted to hear from people who were successfully using AI in their games right now. I wanted evidence of some actual proven uses for AI in the industry we know, not some speculative idea of what could maybe be done with AI at some undetermined point in the future if we just throw a few more massive funding rounds at it.

I went to two presentations clearly fitting that description, one from King on "How AI's transformative role in level automation production adds business value in Candy Crush" and another from Schell Games on "AI-assisted player support in the Among Us VR community."

The King session had two representatives from the company talking about the automated testing tools they use for the ubiquitous mobile hit, and while I have no doubts about the effectiveness of the process described, I was not exactly blown away by the whizzbang technology on display.

[Image: a Candy Crush level mid-match, an 8x9 grid of multicolored candies with match effects across the bottom half of the board]
Making a Candy Crush level doesn't seem like the most ambitious use of AI, but even that leans heavily on human designers

At the risk of oversimplifying: a human designer creates a Candy Crush level, then a playtest bot designed to mimic human behavior plays that level an obscene number of times. The team looks at what percentage of those runs the bot won, and then assigns the level a difficulty score accordingly.

They also have a second bot they use to determine tweaks that could be made to the level to bring it in line with whatever difficulty score they want. It might change the number of different-colored candies on the board, raise or lower the move limit, re-locate the "blocker" pieces that players are supposed to eliminate, or other such adjustments. Then they put those iterated levels through the playtesting bot again, pick the best-scoring one of that bunch, and repeat the tweak-and-test process again until they have a handful of solid options for designers to pick from.
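Purely to illustrate the shape of that loop, here is a minimal Python sketch of the tweak-and-test cycle as described in the talk. Every specific in it (the level fields, the mutation choices, the toy win-rate model, the target difficulty) is a hypothetical stand-in; King did not show code, and this is not their tooling.

```python
import random
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Level:
    colors: int       # how many candy colors appear on the board
    move_limit: int   # how many moves the player gets
    blockers: int     # how many "blocker" pieces must be cleared

def playtest_win_rate(level: Level, runs: int = 1000) -> float:
    """Stand-in for the playtest bot: play the level many times, return the win fraction."""
    p = max(0.01, min(0.99, 0.9 - 0.08 * level.colors - 0.02 * level.blockers + 0.01 * level.move_limit))
    return sum(random.random() < p for _ in range(runs)) / runs

def tweak(level: Level) -> Level:
    """Stand-in for the second bot: propose one small change to the level."""
    field_name = random.choice(["colors", "move_limit", "blockers"])
    return replace(level, **{field_name: max(1, getattr(level, field_name) + random.choice([-1, 1]))})

def tune_to_difficulty(seed: Level, target_win_rate: float, generations: int = 25) -> Level:
    """Repeat tweak-and-test, keeping whichever candidate lands closest to the target win rate."""
    best = seed
    for _ in range(generations):
        candidates = [tweak(best) for _ in range(8)] + [best]
        best = min(candidates, key=lambda lv: abs(playtest_win_rate(lv) - target_win_rate))
    return best

# A designer's hand-built level, nudged toward a roughly 30% win rate for the bot.
print(tune_to_difficulty(Level(colors=5, move_limit=25, blockers=12), target_win_rate=0.30))
```

The interesting part of the real system is presumably the playtest bot itself; the outer loop is a fairly ordinary search, which is part of why the whole thing reads more like automated testing than "AI."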

This was called an "evolutionary generative algorithm," which is a very fancy and impressive name that certainly sounds cutting edge. And I have no doubt of its effectiveness, but this sounds more like automated testing than artificial intelligence, and that's not exactly a cutting edge development in games.

QUOTE | "Invariably, when we start talking about quality assurance and testing it's not long before the talk turns to automation." – EA Bioware director of quality assurance Alex Lucas says he has spent most of the past decade talking about such automated testing strategies. And that's not a recent quote; that's the lead sentence in his 2013 blog post on the subject, so he was apparently knee-deep in today's AI trend since the days when people thought the height of AI was "the aliens in this PS2 shooter are so smart they will hide behind cover."

As for the other AI session, it was focused on Schell Games' use of Modulate's ToxMod voice moderation tool in Among Us VR.

[Image: a group shot of Among Us VR characters wearing goggles and celebrating, one giving two thumbs up to the camera]
Schell Games gave its AI-powered voice mod tool a thumbs up

ToxMod is billed as using machine learning tech to monitor in-game voice chat, flagging snippets of potentially problematic conversations and sending them to human moderators to determine a course of action. Schell then built its own moderation tool around that, one that tracks incident reports against players, previous bans, and appeals, and allows bans to be adjusted or lifted.
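For a concrete picture of that workflow, here is a hypothetical Python sketch of how flagged snippets might flow into a human review queue and a per-player incident history. None of these names or calls come from Modulate's actual API or Schell's internal tool; they only mirror the pipeline as it was described on stage.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Flag:
    player_id: str
    transcript: str   # snippet of flagged voice chat, transcribed
    severity: float   # model's confidence that the snippet needs human review
    source: str       # "toxmod" or "manual_report"

@dataclass
class PlayerRecord:
    player_id: str
    incidents: list = field(default_factory=list)   # (timestamp, transcript, action taken)
    banned_until: Optional[datetime] = None

def enqueue_for_review(flag: Flag, queue: list, min_severity: float = 0.7) -> None:
    """Only hand flags above a severity threshold to the human moderation team."""
    if flag.severity >= min_severity:
        queue.append(flag)

def resolve(flag: Flag, record: PlayerRecord, action: str, ban_days: int = 0) -> None:
    """A human moderator decides the outcome: log it, apply or lift a ban, leave room for appeals."""
    record.incidents.append((datetime.utcnow(), flag.transcript, action))
    if action == "ban":
        record.banned_until = datetime.utcnow() + timedelta(days=ban_days)
    elif action == "lift_ban":
        record.banned_until = None
```

The key detail, per the session, is that the model only handles triage; every consequential action still goes through a person.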

The game also allows players to report others manually and, initially, the development team heavily prioritized those manual reports over ToxMod's flagging. But not for long.

QUOTE | "As time passed and we learned more about moderation, we learned those manual reports were less and less important. It turns out it's really fun to just report your friends. Also unfortunately, sometimes people weaponized the tool… we found that upwards of 90% of those manual reports are not viable." - Schell Games senior players support specialist Laura Norwicke Hall explains why the studio stopped prioritizing manual reports in its moderation queue.

While Schell Games' experience using Modulate's ToxMod was clearly positive on the whole, Norwicke Hall did note some problems with the tech. First of all, it understandably had to be tweaked, because ToxMod was prone to flagging players talking about murdering people, which is not always helpful in a social deduction game where players try to figure out who is murdering everyone on a space station.

Beyond that, ToxMod also had trouble identifying "stealth toxicity" like grooming or radicalization, and it had difficulty discerning sarcasm. There may also be issues depending on where a game's player base is located, as ToxMod doesn't work with every language. (Modulate says ToxMod can moderate in 18 languages currently.)

QUOTE | "The AI tools get it right a lot, but we still need people involved in the process." – Norwicke Hall says the tech isn't ready to fly solo yet.

Like automated testing, speech recognition tech is nothing new, even in games. Apple launched Siri a dozen years ago, Seaman and Hey You, Pikachu! first launched in 1999, and those were of course following in the footsteps of government and tech projects in the field from decades before.

I don't doubt machine learning has made the technology better these days, but this is not world-changingly new in the way AI's most vocal proponents have promised it would be.

QUOTE | "movies are going to become video games and video games are going to become something unimaginably better" - OpenAI CEO Sam Altman, on the aformentioned Nazi bar / website last night.

I guess if you can't get the AI to stop spouting hallucinatory nonsense when it tries to imitate people, then the strategic alternative is to normalize people spouting AI-style hallucinatory nonsense? Masterful gambit, sir.

Not since blockchain has a technology caused this much agony for people forced to come up with generic stock illustrations representing it.

What do AI people mean when they say AI?

Fields like automated testing and speech recognition may be where we see the clearest benefits of the grab bag of technologies bundled together under the AI moniker these days, but they aren't the reason for the investment in the space.

The real reason for all this investment is generative AI and the assortment of text- and image-generating toys that can take a simple prompt and deliver something honestly more impressive than you might have expected computers to be capable of a decade ago.

It's a bit like the proverbial room full of monkeys clacking away on typewriters. We may not have time to wait around for the collected works of Shakespeare, but if you were a few days into the exercise and one churned out the shooting script for 2005's Son of the Mask... Well, you wouldn't think it was good, per se, but it would still be impressive, in its own way.

As for why that prospect has the investor class all hot and bothered, it's because for these people, "Do more with less" is not a dreaded thing to hear from one's boss but an aspirational goal on par with "Buy low, sell high."

The heart of the AI trend has always been to make the creative process cheaper by automating work currently done by people

Sure, doing more with less doesn't exclusively mean laying off people because you can have computers do the work instead. For example, I'm cautiously optimistic that generative AI can be used to lower the burden of repetitive scutwork, and that some creators will find it helpful in brainstorming ideas.

And those are both beneficial outcomes for developers, but again, they aren't at the heart of this trend. Whether or not anyone pursuing this tech is foolhardy enough to say it out loud, the heart of it has always been to make the creative process cheaper by automating work currently done by people, thus eliminating the need to employ those people.

Just look at EA CEO Andrew Wilson talking about neural networks and machine learning seven years ago.

QUOTE | "What we know about neural networks and machine learning these days is that you can feed [a computer] every poem that Emily Dickinson has every written, then give it a subject, and it will write you an Emily Dickinson-esque poem that to a layman like me is indiscernible from the real thing. You can feed a model every painting Monet has ever done, give it a photo and it will paint you a Monet that to the layman is indiscernible from the real thing." – Speaking with Glixel in 2017, Wilson promises that from 2017 to 2022, technology would "change the way that games are made and experienced more than anything that happened in the last 45."

I mean, there was a pandemic and everyone can work remotely now, so I guess that's one argument in Wilson's favor. But it's been seven years now and the way games are made and experienced mostly seems kinda the same, or at the very least the change from 2017 to 2024 seems less pronounced than the one from the seven years before that. That would be when digital distribution, mobile gaming, and all their attendant business models were really taking off, bringing tracking and analytics tools to the fore in a way that really did change the way games are both made and experienced.

But that's not why I'm bringing up that quote. It's hard to predict the future, and being off by a few years – or even just being wrong about something you haven't bet your core business on – isn't that big a deal.

There's clearly a market out there for an endless supply of slop pumped into the media feedbag people keep strapped to their faces

The reason I'm bringing up the quote is because of Wilson's point about the products of generative AI tools being inferior to the real deal, but good enough for the masses. Because if there's one thing I have learned watching the tech and media industries over the past couple decades, it's to never underestimate what kind of substandard offerings people will accept as good enough if something is convenient and/or free. (And then once people have made something a habit, I've continually been surprised and dismayed at just how much worse it can be made before they finally decide to break that habit.)

There's clearly a market out there for an endless supply of slop pumped into the media feedbag people keep strapped to their faces. As a society, we also seem to be decreasingly picky about exactly what slop goes into that bag, content to let an algorithm usher us from one thing we didn't ask for to the next, willing to settle for whatever the services we subscribe to have to offer us instead of something available elsewhere that we actually want, but might have to do or pay something to get.

I'm deeply skeptical that generative AI will be used to produce better books, better movies, or better games. I am considerably more worried it will be used to produce the equivalent of another sequel to The Mask, the shoddy sort of knock-off that, as Wilson put it, "to the layman is indiscernible from the real thing." Or at least not so intolerably inferior that audiences would be roused from their habits to demand something better.

The rest of the week in review

STAT | 0 – The number of good excuses I have ever heard from executives and owners who have laid people off without severance. That stat remains unchanged despite former Prytania Media CEO Annie Delisi Strain finally addressing the closure of Crop Circle Games this week.

QUOTE | "What I did not understand was that the current economic downturn in the industry is not just another economic cycle — it is a permanent and sustained alteration and contraction of the industry we all know and love." [Emphasis in original] – In her statement on Crop Circle's closure, Strain explains that she initially thought the company would be fine but she underestimated the scope of the industry's recent troubles.

To be fair, it is still tough out there.

STAT | 4 – The number of layoff stories we ran this week, with cuts at Relic, Gearbox, Ubisoft, and Certain Affinity.

But it's not like the money faucets have run dry entirely.

STAT | 5 – The number of new investment stories we ran this week. Sure, they weren't all eye-popping numbers – Midas Games got $1 million, Lil Snack landed $3 million, and Mika Games secured $10 million – but combine those with A16Z Games' $75 million accelerator program and Bitkraft Ventures' $275 million investment fund, and there are certainly a fair number of people out there who probably aren't buying into the "permanent and sustained alteration and contraction" assessment.

QUOTE | "We were getting congratulated left and right at GDC about leaving the evil Embracer. But these are the nicest people you've ever met. Lars [Wingefors, Embracer CEO] has an archive of video games. He loves games." – Saber Interactive founder Matthew Karch seems to think people are criticizing Wingefors because he's mean and not a real gamer, when they're actually criticizing him for the entirely predictable fallout of betting the company's health on someone else's willingness to put $2 billion into a company that was being run in such a way that it would face catastrophic consequences if it did not get a $2 billion injection of cash at that very moment.

QUOTE | "When the markets are supporting this manic kind of growth and you're in a situation where you could take advantage of that, it's hard not to leverage that. It's hard not to leverage a strong share price to go out and buy assets." – Karch explains that it's OK when people take actions that could lead to significant personal gain even though they risk creating tremendous human misery because of capitalism.

QUOTE | "I blame being a publicly traded company for some of the woes that Embracer has. And I blame the fact that people are trying to take advantage of other people's misery through shorting the stock as something that has resulted in a depressed share price for the company and thus some of the layoffs." – Karch explains that it's not OK when people take actions that could lead to significant personal gain even though they risk creating tremendous human misery because of capitalism.

QUOTE | "We are getting approached... on a weekly basis by companies that would like to acquire certain assets within the group. And I've been very clear that they're not for sale, because they're a very important part for the group and for the shareholders of the group going forward." – Wingefors signals an end to The Great Embracer Fire Sale of 2023-2024.

QUOTE | "Speaking to a major European retailer last week, I was told they 'can't shift anything with an Xbox logo on it'. And during GDC, two games publishers/developers independently told me they were struggling to justify supporting Xbox platforms." – Our own Chris Dring drops some industry scuttlebutt in an opinion piece on the seemingly stagnant console space and what happens in a year where market leaders Sony and Nintendo don't seem to have much on offer.

QUOTE | "With Korea saturated, China highly regulated despite having hundreds of millions of PC gamers, and the rest of Southeast Asia having low per-capita income despite a large PC player base, the only feasible option was to enter the Japanese market." – In a feature on the shifting Japanese gaming market, Niko Partners' Darang Candra explains why Microsoft has taken a greater interest in PC gaming in the country.

STAT | 4-0 – FTC Commissioners voted unanimously to deny the ESRB's petition to make facial age verification technology an acceptable way to get parental consent to collect children's information in games under US law, an effort we talked about in this space last year. The FTC didn't weigh in on the actual merits of the application though; it basically just punted on the question and suggested the ESRB could re-file the application after another governmental body finishes its pending review of the tech.

QUOTE | "Guess I'll just piss on the floor" – The kind of thing you get to put in headlines on a serious trade website when you cover an industry as goofy as this one.


Brendan Sinclair: Brendan joined GamesIndustry.biz in 2012. Based in Toronto, Ontario, he was previously senior news editor at GameSpot.