Unfortunately, business is business, and what’s apparently best for game companies isn’t always best for players. A number of questionable features in new video games (and upcoming consoles) have turned otherwise happy gamers into teeth-gnashing trolls.
Some of these features are meant to curb piracy. Some aim to boost revenues. Some attempt to promote technology. Others? Well, we're really not sure why anyone thought they were a good idea. Here are five new game features we desperately wish we could unplug.
‘Always-on’ Internet connections

If you tried to play SimCity earlier this month -- only to be denied -- you know the pain of mandatory Internet connections.
When you buy a game, you want to be able to play it immediately. That seems simple enough. But when a game requires you to log into the company's servers every time you want to play -- even if it’s a single-player experience -- it can be a serious headache. Internet’s out? You can’t play. Servers down? You can’t play. That might be expected of massively multiplayer games like World of Warcraft, but seeing it affect games designed and marketed for solo play is worrying.
It's not a new problem, either. Blizzard faced it in 2012, when millions of prospective Diablo III players faced the ‘Error 37’ message instead of the game’s opening cinematic. Similar issues marred the launches of World of Warcraft and Star Wars: The Old Republic.
Publisher promises of "being ready" ring hollow these days, and gamers are already fed up with buying games only to find them unplayable due to connection issues. Unfortunately, we're powerless to do anything about it. If we want to play these sorts of games -- and when the game is as good as SimCity, we absolutely do -- we're forced to deal with this sort of frustration.
Endless system updates

It's great that console makers are able to update their system dashboards to include new features. We just wish they'd do it a lot less frequently.
Large system updates are a nearly constant occurrence on the PlayStation 3 and PlayStation Vita, often with no discernible results. The Wii U needed one right out of the box when it launched. Even the Xbox 360 (which tends to require just one major update annually) still has smaller mandatory updates throughout the year.
Waiting for your system to update itself when you want to play a game is maddening. Rather than the console downloading these updates in the background as you play, you're forced to sit staring at a sluggish progress bar.
The PS4 won't demand a permanent Internet connection, but Sony's nudging players in that direction. And whether Microsoft will require one remains to be seen. But let's hope the companies are smart enough to push those upgrades out when systems are dormant, rather than right when they're turned on.
Touchpads everywhere

Remember how cool it was the first time you toyed with your smartphone’s touchscreen?
Now remember how cool it was when you tried using it on your dedicated gaming device?
Yeah, we don't either.
There's certainly a place for the touchpad in gaming – specifically, your controller-less phone – but the success of the mobile/tablet market has driven game makers to cram it everywhere. Sony put TWO on the PlayStation Vita, but that still couldn't hook players. Nintendo dedicated a huge chunk of the hefty Wii U controller to the touchscreen. Sony's even sticking one on the PlayStation 4's DualShock controller, though we’re not sure what it actually does yet.
They seem innocuous enough, but all those touchscreens still feel like gimmicks. This is a problem that may resolve itself in a year or two, but for now, those screens are just getting in the way -- and running up the cost of hardware to boot.
Money-grubbing downloadable content

Some downloadable content is absolutely, positively worth the investment. When Rockstar extended Grand Theft Auto IV with a trio of episodes, for example, no one in their right mind complained about the quality or value proposition.
Other DLC? It's a pure cash grab. The most infamous example came in 2006 when The Elder Scrolls IV: Oblivion asked players to pay about $3 for a piece of virtual horse armor. (Bethesda has since learned its lesson, mostly.)
More recently, EA raised a stink by adding a ‘pay to win’ model to Dead Space 3, then stuck its foot in its mouth by saying it was interested in having microtransactions in all of its games. The company has backtracked from that statement, but microtransactions are still a huge part of the gaming landscape today – and it’s only getting bigger. They’re even cropping up in Call of Duty, the biggest video game in the world.
The problem isn’t microtransactions themselves, but how companies use them to squeeze money out of consumers. In free-to-play games, dropping a few extra bucks is no big deal, since you never made an investment to begin with. But after shelling out $60 for a new game, being nickel-and-dimed for small items that could have been included from the start is only going to frustrate players more in the years to come.
Unnecessary multiplayer elements

Perhaps because of the success of Call of Duty, whose multiplayer components make it a year-round hit, more and more games are adding multiplayer modes. And in some cases, that's a good thing.
But a growing number of single-player games that have no need of multiplayer are jumping on the bandwagon in an effort to keep people playing after they’ve beaten the main game.
Tomb Raider, for example, is a fantastic reboot of the series -- and while there's nothing particularly wrong with the game's multiplayer, it's wholly unnecessary. We’ve seen it wedged somewhat awkwardly into games like Assassin's Creed III, Far Cry 3, and, most famously, BioShock 2. By and large, these games would have been just fine without multiplayer, and the creators could have spent that development time adding even more flair and polish to the single-player games.
It's bad enough that single-player experiences are becoming less and less common. To have them paired with sub-par multiplayer is just adding insult to injury.