As E3 closes up shop for the year, we finally have a moment to reflect on a week of gorging ourselves on news about the continuing console war between Microsoft and Sony (and, uh, Nintendo). It's all very new and exciting news about the next half-decade of gaming, but in a lot of ways, it feels like the same old fight we've been having for decades now. It's time for something new.
So much of console gaming seems complacent or even insistent on remaining exactly as it always has been. You might buy some of your games online now, and you have a hard drive now. Voice and motion and second screen exist. But the basic building blocks of what makes consoles consoles remain intact, warts and all. And it's past time we found a better way.
On the one hand, backwards compatibility in a console can and should be written off as a luxury that's nice to have, but not especially necessary. No one is forcing you to sell or break or dispose of your old consoles, and you can always buy old consoles inexpensively even decades after the fact.
On the other hand: What in the actual fuck? Video games have for decades been dying to be considered equals with other forms of entertainment. By now, everyone is mostly in agreement that at a minimum they have the capacity to be artful and tell smart, engaging stories in creative ways. But unlike every other content medium, games come with an expiration date. Want to play Final Fantasy 12 right now? Better dig out your PlayStation 2.
The closest analogue to this dead-ending might be out-of-print books: there's practically an entire century of books missing from print because publishing houses are waiting on copyrights to expire. Like those books, old games eventually return, as emulated titles on phones and PCs, or as retro titles ported or remastered on new systems. It's Square Enix's entire business model at this point. But there's still a lag between old and new that lasts entire generations. A single console that plays All The Games would be much closer to the platonic ideal, right?
Sony is the closest to making this happen, allowing cross-device play between the PSP and PS Vita and the PS3 and PS4. It's also got a massive library of original PlayStation titles on PSN. As a retrofit, that's a pretty good solution. But going forward, streaming via Gaikai—the cloud-gaming company Sony bought to serve up its back catalogue—seems, at best, like an always-connected headache, and at worst, a possibly unplayable disaster.
Nintendo, meanwhile, has an admirable history of making its hardware as backwards-compatible as possible, even if that makes for awkward hardware contortions. Even so, the new Wii U plays only Wii discs plus whatever trickles out on the Virtual Console, and has no integration with Nintendo's wonderful handheld game library.
And the Xbox One just doesn't play anything but Xbox One games.
The funny thing is, the answer is staring everyone in the face. Android-based gaming consoles like Ouya are beginning to creep into existence, and iOS won't be far behind. The problems these boxes will face are totally different from traditional consoles' challenges, but backwards compatibility is never an issue for either. Apps might be missing new features, or might not be optimised for new screens, but they're never more than a little tinkering away from being just fine. It's a much better system than tossing a pile of plastic discs in the trash heap every seven years.
This feels sacrilegious to even say, but should we maybe take a step away from the graphics arms race?
OK, hold on, hold on, hold on, hold on, hold on. Stop t—Ow. Stop throwing things. And put down the halberd.
This isn't to say that we stop demanding beautiful visuals. Of course we want game graphics to give us a hard knock in the pants. But a large part of console cost goes into improving visuals that will feel stale in a few years no matter what. And frankly, if you were going to demand the very best graphics, you wouldn't be using a console anyway. You'd be on a gaming PC.
Instead, since the PS4 and Xbox One are at this point every bit as much for streaming content as they are for playing games, why not work toward optimising all the ways you use the system? A more holistic method of improvement, instead of just one very sharp katana blade and a bunch of decorative magnets slapped on the side.
Microsoft's already taken steps in this direction; it reserves some of the X1's resources for the system rather than for games. (The Xbox One's OS uses 3GB of the X1's 8GB of RAM so that it can switch between games and other features seamlessly. It's possible its specific balance is off—this is new for everyone.) It's also an argument for not letting gameplay and creativity and artfulness be forgotten while every available resource gets thrown at graphics. Because the way graphics technology is progressing, it doesn't matter how much of today's hardware you devote to visuals: it will seem ancient in a few years—at which point a balanced system might not seem like such a bad idea.
It's not clear whether Microsoft's specific split-OS gambit will pay off; there are arguments for and against, and if it really does impair the experience of playing games, that's a problem. That may well be the fate of Nintendo's Wii U, which is far underpowered compared to the PS4 and X1, and suffers from interminable load times. But the idea that a small difference in graphical output—assuming the difference materialises there and not in load times or framerates—affects your game experience is an outdated worldview. Graphical horsepower isn't necessary to make a beautiful game. Can you imagine Ni No Kuni looking much better with a little more RAM? Me neither.
The entire idea of video games as an accessible medium has one big sticking point: Price.
Consoles have always been the more "affordable" option compared to, say, a tricked-out gaming laptop, or even a £1,000 rig with Intel inside. But unlike, say, someone who isn't very interested in music but wants to see what all the fuss about Daft Punk is about, a non-gamer who randomly bumps into a Zero Punctuation video about BioShock can't reasonably pop out and try it without significant investment. On a new console, buying and owning a single game to play will run you half a grand, at least. That continues to be nuts.
So what's the fix? Both Sony and Microsoft now require a monthly fee to game online, and Microsoft requires one just to use any online services at all. Why not take another look at subsidised consoles? Microsoft kicked the tyres on subsidised Xboxes with a £100 Xbox 360/Kinect deal that locked you into Xbox Live Gold for two years at £10 a month — more than the current cost of buying XBL Gold on its own. Couldn't something similar work for the new systems, the way it does for mobile phones? Maybe you could even bundle that contract with services like Netflix, which would probably pay the console makers a nice fee to guarantee customers for years at a time. The point is, there are creative ways to get the price down. Which is important, because of what a lower entry price for new hardware allows: faster releases.
No one wants to blow £300 or £400 on a new console every two or three or even, probably, four years. But plenty of people are fine picking up a new phone for £200 or £300 on two-year cycles. What if you emulated Google's Android policy and said all new games must work on hardware within, say, two generations of current? Building on a unified platform (remember that one?) would obviously be crucial here. But picking up a new PlayStation every two or three years—a much more reasonable period for hardware to stay current—would be stomachable if you knew you could hang onto it for six years if you wanted, or trade it in and upgrade for a few hundred quid if you wanted a new one.
The great advantage of making a gaming console is uniformity. For players, it creates a level playing field where the only differentiating factor is skill (and connection speed and free time, but that will always be true). For developers, the parameters are set and the minimums and maximums are clear. But faster release cycles and games running on multiple systems open the world up to fragmentation, which is a dirty, dirty word. A regimented release cycle—guaranteed to happen at certain times—would largely temper those issues, though, introducing only a handful of variables instead of the infinite possibilities in PC configurations and Android phones.
There are of course iterations of consoles within generations, as well as price drops, but those don't refresh the internals in a meaningful way. And you can always just buy an older system, which will have a load of cheap and wonderful games to play. Those are great options for a lot of people, and will continue to be. But games and their hype machine work on new. Online multiplayer for games like the yearly Assassin's Creed becomes much more sparsely populated inside of a few months. Your friends will definitely not play FIFA 11 with you online. And good luck finding anyone who's still excited to strike up their five thousandth conversation about BioShock Infinite in a few months.
So what makes sense for consoles? The next consoles, whenever those finally arrive? It's actually pretty simple—and achievable.
Play all of our games, or at least all of the games your lineage played. Have great graphics, but understand you're more than a cog in a rendering farm. Be more affordable, even if it means getting creative with how you have to accomplish it. Maybe come out more frequently, because really, it's awkward when our phones surpass your capabilities.
Most of all, try new things. Be different, and not just different with new controllers or sensors or screens. Rethink the traditional limitations and borders of consoles—we have the technology and infrastructure to do things very differently than we did in the old console wars. Let's start acting like it.
And if Microsoft and Sony and Nintendo don't, you'd better believe someone else will.