It’s a well-established fact that at the beginning of each console cycle, there is typically a surge of new IPs and innovation in the industry. When Sony announced that they would keep the price of PS4 games at $60, I started thinking about how that would affect developers struggling with higher production costs.
After much thought, I have to question what will truly be different about the next console cycle, at least where Sony and Microsoft are concerned, that could somehow unleash a sea change of innovation.
I started considering a number of factors, largely in the realm of economics, but some psychology as well. And at last, I have a theory that I hope to test in the future.
I call it the Inflation-Expectations Hypothesis of Videogame Innovation. Yes, the name is wordy, but it spells out the three pillars of my idea: monetary inflation, consumer expectations, and their effects on innovation in videogames.
The hypothesis works like this: As time passes, money loses real value. Since videogame prices are locked for the entire duration of a console cycle (5-7 years, nowadays), this means that real return on investment will be greatest early on in the console cycle. (If a game pulls in $30 million in revenue, that $30 million is more valuable in 2007 than in 2012, for example.)
If a game flops commercially, then it flops. But for any given number of copies sold, a videogame publisher will maximize its real return early in the cycle. Why? As time passes, the real (inflation-adjusted) revenue for each sold copy decreases, while development costs increase as salaries rise in proportion to inflation. Thus, when publishers want to take risks, it’s best to do so early in the console cycle.
According to the inflation calculator from the Bureau of Labor Statistics, $60 in 2013 has the purchasing power that $52.53 had in 2006. Conversely, publishers would need to charge $68.54 today to match the real value of $60 in 2006. Clearly, it makes more sense to take risks early on: even if a game flops, whatever revenue does come in is worth more in real terms.
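To make the arithmetic concrete, here’s a minimal sketch. The inflation factor is the one implied by the BLS figures quoted above ($60 in 2006 buys what $68.54 buys in 2013); it isn’t pulled from any live data source.

```python
def deflate(nominal, inflation_factor):
    """Convert a nominal dollar amount to base-year dollars,
    given the cumulative inflation factor since the base year."""
    return nominal / inflation_factor

# Cumulative 2006 -> 2013 inflation implied by the figures above:
# $60.00 in 2006 dollars = $68.54 in 2013 dollars.
factor = 68.54 / 60.0  # ~1.142

# A $60 game sold in 2013, expressed in 2006 purchasing power:
print(f"${deflate(60.0, factor):.2f}")  # ~$52.52, matching the BLS figure up to rounding
```

The same nominal $60 sticker thus quietly loses about an eighth of its real value over the cycle.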
The 7th generation of consoles would seem to bear this hypothesis out. 2007 and 2008 saw the rise of Assassin’s Creed, BioShock, Mirror’s Edge, Mass Effect, Gears of War, inFamous, LittleBigPlanet, and countless more. By contrast, the biggest games of the 2012 holiday season were all sequels in established franchises: Assassin’s Creed 3, Call of Duty: Black Ops 2, and Halo 4.
Yes, 2012 also saw Dishonored. That doesn’t fit my hypothesis. But I never said this was an absolute rule, merely a tendency. Bethesda took a risk in financing Dishonored, and clearly it paid off. That is the exception, not the norm. Whereas new IPs were the leading games of 2007-8, 2012 was largely a year of sequels. (As was 2011, for the most part.)
Second, it’s easier to expand on existing IPs. For publicly traded companies, it looks better on quarterly reports to be working on established IPs than on new ones. Why? For an established IP, a publisher can defer listing development expenses until the game is released. For a new IP, expenses must be listed as they are incurred, and since a game’s development cycle runs 2-3 years, that means 2-3 years of expenses for an unknown product, with the company soothing concerned or questioning investors the entire time. It’s easier, as a matter of finance and executives’ stress, to work on existing IPs.
And if we go back to Dishonored, it was published by Bethesda, a subsidiary of a privately owned company, ZeniMax Media. Since ZeniMax can make decisions without noise from shareholders, and because it doesn’t have to publish quarterly reports for the public to pore over, there was less weighing against funding a new IP late in the console cycle.
Third, there is also the fact that publishers want to hold down costs, and one way to do that is to reuse existing assets. Do you know one reason why Ubisoft pumps out an Assassin’s Creed title every year now? The engine, the level of graphical fidelity…it’s all mostly there. The only genuinely new expenses are the costs to create new assets (different cities = different architecture), new character models, and the story (costs to hire writers and cast voice-actors). So much of the game is already there, and the employees have already gone through the learning curve: despite rising costs from inflation, companies have a strong incentive to reuse existing assets in order to hold down costs.
Those are the main monetary reasons for taking risks early on. For any given number of copies sold, the return on investment is maximized early in the cycle. Since most people (and thus companies) are risk-averse, that is, they demand a greater return as compensation for more risk, the early years of a cycle offer the greatest opportunity to capture that greater return, assuming that sales (the number of copies sold) are exogenously given.
In short: to accept more risk, an investor (here, a videogame publisher) demands a greater return as compensation, and the early cycle is when that return is on offer.
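The argument can be put in toy-model form. Every number below is a made-up assumption for illustration (a 3% inflation rate, 500,000 copies sold, a $20 million budget); the point is only the shape of the result: with sales held fixed and the nominal price locked, the same game earns more in real terms the earlier it ships.

```python
ANNUAL_INFLATION = 0.03        # assumed rate, for illustration only
COPIES_SOLD = 500_000          # hypothetical, held fixed ("exogenous")
PRICE = 60.0                   # nominal price, locked for the whole cycle
DEV_COST = 20_000_000.0        # hypothetical budget, in real (year-0) dollars

def real_profit(release_year):
    """Real profit when the nominal price is fixed but salaries
    (and thus budgets) keep pace with inflation, so the budget is
    roughly flat in real terms while revenue shrinks."""
    deflator = (1 + ANNUAL_INFLATION) ** release_year
    real_revenue = COPIES_SOLD * PRICE / deflator
    return real_revenue - DEV_COST

print(real_profit(1) > real_profit(6))  # True: the year-1 release earns more in real terms
```

Under these assumptions the year-1 release clears roughly $9 million in real profit versus roughly $5 million for the year-6 release, which is exactly the risk-compensation gap described above.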
So where do expectations come in?
Simply put, there is a social expectation – from consumers, from shareholders, from the companies themselves – that developers are supposed to innovate at the beginning of the console cycle.
This is deeply rooted in the technological advancements that have typically defined each console cycle. As graphics, processing power, and the controllers themselves evolve, there is an expectation that developers will find new, innovative ways to use the extra capabilities. When the SNES offered more buttons, greater storage space, more RAM, and greater technical capabilities than the NES, we saw an evolution in visual aesthetics and control schemes. When the N64 offered an analog stick with a full 360 degrees of movement, we saw 3D platforming take advantage of it. When the Xbox and PS2 offered two control sticks, we saw an evolution in control: one stick moved the character while the other controlled the camera, which proved crucial in genres like the first-person shooter.
Consumers expect innovation early in the cycle, and they’re more willing to buy new IPs. Shareholders expect companies to innovate as the console cycle begins, considering it a necessary R&D expense to develop new revenue streams in the form of newly popular franchises. And companies respond to these incentives by innovating before settling into a stable pattern of sequels.
But is that really going to happen now?
Two main things are discouraging me from hoping for innovation in the next cycle: Game Prices and Technological Advancements.
This mainly applies to Sony and Microsoft. I have no idea what Nintendo is doing with the Wii U, but they have taken sensible steps. The new GamePad will encourage new methods of play, and by raising the price of new games to $60, companies can earn the same real revenue now as they could when the Wii first launched with $50 games.
But Sony and Microsoft? Sony kept prices at $60. Thus, companies gain no more real revenue from making PS4 games than they currently do from PS3 games. Both sell at the same price, and at least on the PS3, companies have already climbed the learning curve. The PS4 is a whole new creature that they’ll have to take time to understand, burning through money all the while to take advantage of its higher graphical capabilities.
And the technological advancements? If anything, they’ll only send costs spiraling without creating new ways to innovate. The DualShock 4 controller is essentially identical to the DualShock 3.
I’ll go through the ‘changes’ in order to avoid seeming facetious. Yes, the touchpad is interesting, but it’s unclear what anyone can actually do with it. There’s an LED light bar, but it’s more of a gimmick than a revolutionary change. The Sixaxis motion sensing and the control sticks are more sensitive. Big whoop. And there’s a “Share” button for social media. Again, big whoop.
The touchpad is the only potentially innovative design choice in the controller. But Sony hasn’t offered any information on how it can be used. I get the impression that the touchpad will go the way of the Wii’s USB ports: present, but unused, and thus worthless.
Unless Microsoft revolutionizes their controller, they’ll be in the exact same position as Sony: a $60 price tag on new games, preventing developers from increasing revenue to keep pace with inflation; better graphics, requiring higher development costs; and a controller nearly identical to the 360’s. In short, nothing new or even remotely surprising. At least the touchpad has promise.
Consumers will expect innovation, as will shareholders, but will it make financial sense? Without increased revenue, companies are setting themselves up for an impossible situation. Triple-A development is already cutthroat: one failure can jeopardize a company. When Ensemble Studios developed Halo Wars, that single game’s lackluster sales got the developer shuttered. They were the creators of the Age of Empires series, but a single flop was their downfall.
This has played out with other studios. BlueTongue was shuttered after de Blob 2 flopped. (Kotaku has a list of defunct studios since 2006. It’s massive.) Zynga Boston was shuttered last year. THQ, a large third-party publisher, declared bankruptcy a couple of months ago. Hudson, the creator of Bomberman and Mario Party, was shuttered in 2012 after almost 40 years of operation. Pandemic was closed by EA in 2009; RedOctane, a developer for Guitar Hero, was closed in 2010. It’s a gruesome picture, yet it all underscores the desperate need for constant, massive successes.
In this generation, the industry has become hit-driven to an extreme. But failure carries devastating consequences, and going into the next console cycle, the absolute best time to innovate, that fragility could cripple the industry.
Let’s hope for new, interesting IPs that offer brand-new stories and experiences. But if we don’t see it, I can’t say I’ll be surprised.