On Wednesday night, at the Hammerstein Ballroom in New York City, Sony will announce its new PlayStation. Shortly thereafter, Microsoft will announce its new Xbox. Because of a torrent of technical leaks, we already know a lot about the new machines: what’s inside them, what powers them, and what doodads will be emblazoned on the box art. The shape of the next-generation game console is becoming clear. And yet, there’s one aspect of the new crop of gaming systems about which we know almost nothing. It’s their most crucial feature, the reason millions of people pay the hundreds of dollars that have turned big gaming into a world-bestriding behemoth.
Specifically, what the hell is a next-generation video game?
Well, we can start with the obvious, which with new consoles is also the superficial. The basic trajectory of computing tells us that next-generation games will look more impressive. How much more impressive, exactly? The creative director of one of the largest game companies in the world told me that the programmers he works with have estimated the new consoles will allow 10 times as many on-screen “assets” (think: objects in your field of view) as current systems. (It’s worth pointing out that high-end gaming graphics appear to be entering a period of diminishing returns — and “10 times” as many assets can mean, say, 10 times as many blades of grass in a field. In other words, expect impressive detail but not as obviously profound a jump as from, say, the Super Nintendo to the Nintendo 64.)
It is a rule of development for new consoles that costs rise. So if the new games can harness 10 times the resources of our current games, the creative director said, “you need 10 times as many people working on the game.” The most graphically intensive games now cost anywhere between $50 and $100 million to make, and as that figure explodes, developers and publishers will have to consolidate their development budgets on fewer titles. As a result, expect the number of huge-budget “event” games — your Call of Dutys and Assassin’s Creeds — to plummet.
There is almost no question that this high-end market is going to be controlled by the two or three publishers that have the financial and human resources to “brute force” (as the creative director put it) the megabucks sequels that still have massive profit margins. Fewer publishers mean higher costs, and those costs aren’t going to be borne exclusively by game companies. “There is a future for the big cinematic games — Transformers the game — in which they are going to charge you $100,” the creative director said. The rising threshold to make these games will drown mid-size publishers and prevent the entry of new ones. That’s obviously bad for creativity in big games, encouraging as it does iteration on previously successful products.
But rising costs aren’t the only reason the future of mainstream big gaming involves fewer, more expensive, and less creative games. “Those games are made for the most secure audience for video games, and that is 18[-] to 35[-year-old] males,” the creative director said. You could say, without much argument, that this group of gamers has fairly well-understood tastes. Still, what happens if pandering to those tastes becomes less effective, if the most coveted demographic in the industry gets tired of reliving the asymmetrical wars of the 20th century from behind the barrel of a Kalashnikov, and brute-forcing new wargames stops paying out?
Publishers will have to scramble to find a dominant type to replace the first-person shooter. There is a potential future for these consoles, the creative director told me, in which game companies figure out how to make a MOBA (multiplayer online battle arena, the genre that is coming to dominate computer gaming and which includes League of Legends, the most popular game in the world) playable in the living room, with a controller. These games cost significantly less to make than resource-hungry first-person shooters, and while publishers couldn’t charge nearly the same price for them, they could monetize them in novel, scalable ways (“Go Forth in your own Levi’s armor!”). MOBAs have a close relationship with an ascendant culture of gaming spectatorship, and it’s not hard to imagine the advent of in-game commercials, say, between matches. It’s also easy to picture, given the speculation about video sharing in the new consoles, a game culture in which live video and editor-curated user video content (brought to you by, for instance, Machinima) plays a major role.
But what about the smart, narrative-driven single-player games that made many of us love games in the first place? Ballooning development costs and copycat blockbusters may actually be good for them. In recent years, developers have added multiplayer and cooperative play to story-heavy series such as Mass Effect, Dead Space, and Uncharted in an effort to justify their $60 price and become more than weekend rentals. The result has been a kind of feature creep that hurts the quality of the single-player game. The future for these games, according to the creative director (and Cliffy B), is probably in digital distribution. At $25 or $30, without the costs of manufacturing and physical distribution (and without the development costs of multiplayer functionality that no one asked for), these games can gross the same amount of money they do today. A mid-size developer like Telltale Games, which made one of last year’s best games (the downloadable, episodic Walking Dead), could conceivably thrive in such a niche.
Finally, there are independent games, which are without question the most exciting growth area in the industry. In the past week, Chris Kohler at Wired and Ben Kuchera at the Penny Arcade Report have done some really smart writing about the so-called “Minecraft test,” which asks whether any of the next-generation systems will make it sufficiently easy and affordable for small and hobbyist developers to make, update, and distribute games that the next indie supernova could start on a console. That’s probably the hardest part of the new gaming experience to foresee, but the answer will determine whether gamers can participate in the thriving indie gaming scene on their consoles or will have to switch to a living room computer to do so.
And let’s hope the answer is yes. Because if the disastrous Wii U launch illustrates anything, it’s that gamers, the people who pay extravagantly for these extravagant machines, want more good games. The coming years are going to be a time of unprecedented choice for gamers. Regardless of their horsepower, and their motion controls, and their media capabilities, the new consoles from Microsoft and Sony have to bring these choices to consumers — or the next-next generation of games won’t involve them at all.