Phil Spencer made some interesting remarks during a recent San Francisco media briefing, where Microsoft showcased a number of new cross-platform games for Windows 10 PCs and Xbox One. Spencer suggested that the future of Xbox could lie in upgradable hardware, something we've yet to see come to consoles.
Consoles aren't all that different from PCs, from a technical standpoint at least, but there are reasons for and against buying one over the other. The most notable is upgradeability. A console is a generational device: it lasts 7-10 years on the same hardware, never being upgraded. What this allows for is a consistent, fair experience across all gamers playing the same game.
There are, of course, downsides to this approach as well. Fixed hardware constrains developers from achieving better graphics and more performance, especially toward the end of a generation's life cycle. PCs, on the other hand, are the opposite: they can be upgraded whenever needed, always keeping pace with the latest games and graphics enhancements. The downside is that the gaming experience is not the same across all gamers; some may have powerful high-end custom builds, while others may be using a cheap $200 OEM machine. Performance will always vary.
Since there are advantages and disadvantages on both sides, it's not easy to say which approach is better. But what if there was a middle ground? Xbox One could be the first in this new era of gaming hardware: a console that remains mainly standardized, with upgrade options for those who want them, while still maintaining a relatively fair experience across all gamers. Let me explain.
One of the biggest issues with PC gaming is that you can run pretty much any game on any hardware: on lower-end hardware the game will run rather poorly, while on high-end hardware it will run great. There isn't really a benchmark for developers to target, and each game usually carries a different set of specification requirements, meaning hardware needs to be upgraded frequently.
With an upgradable Xbox One, Microsoft could set a minimum specification for developers to target. The idea is simple: developers keep making games for Xbox One like they always have, but instead of building a game for high-end hardware and then optimizing down, they build it for the minimum required hardware and then optimize up.
As a result, current Xbox One gamers could still enjoy every game released during this generation's cycle and would never be left unsupported when a new game comes out. At the same time, if you're someone who enjoys playing at a higher framerate or resolution, you could upgrade your hardware throughout the console's life cycle for an improved experience. Both groups still get to enjoy the same great games.
Of course, this approach would mean the experience isn't quite the "same" for everyone, but at least every gamer would have a good experience from the very beginning. The problem with PC gaming is that there aren't really any rules for developers to go by, so it's very easy to find gamers playing at terrible framerates or low resolutions.
Microsoft could even take a simpler route, offering new SKUs instead of actual upgradeable components: an "Xbox One Pro" with more RAM, a better graphics chip, and overall better game performance. This route is more likely, and the main premise still stands.
If Xbox One were to take this route, it would be a daring yet exciting change in direction for console gamers. It could be a huge success for Microsoft, or it could kill the Xbox brand as a whole. Of course, all of that is up to the gamers, so I ask: is this a good idea? Should Microsoft stick with non-upgradable consoles, or let developers target a minimum specification and allow gamers to upgrade from there? Let us know below.