Gaming has been with me for most of my life. I started playing as a child and still play as an adult. Unfortunately, in the years since I first picked up a controller, I’ve watched the industry spiral downward; I’ve seen money come to dominate above all else, including player satisfaction.
These are just a few of the things I hate about gaming in 2017, but I also feel they best represent how the industry has gone wrong, especially in the last five or ten years. There’s always a chance that developers will fix the issues I’ve listed below, but it’s equally likely that they won’t and things will get worse before they get better. Regardless, these are a few key points studios need to focus on.
Multiplayer Only Games Shouldn’t Cost the Same As Complete Games
Video games are experiences; they take hours of your life away as you delve into a virtual world. Since the dawn of the home gaming market, that experience has been priced accordingly. A new game generally costs anywhere from $40 to $60, with triple-A titles leaning toward the $60 end of that spectrum. In most cases, I’m okay with this, especially when a game features a quality story mode or offers a vast amount of replayability. Games like Skyrim or Fallout fall under both of these categories.
What I struggle to accept is when a game lacks a dedicated single-player campaign and the developers still feel a $60 price tag is fair, despite the additional costs of season passes and the online subscription required to actually play the game with others around the world.
Furthermore, a game without a single-player campaign can have a much more limited lifespan than one that has one. Multiplayer games, unfortunately, rely on their communities to keep going. If that community dies out completely, so does the ability to enjoy your game to the fullest extent. While it always sucks to see a game’s community die out, it’s easier to swallow if you only paid $30 or $40. Losing $60 is never easy to accept.
Buying a Game Shouldn’t Entail Getting Harassed to Buy a Season Pass
Add-on content has also changed drastically over the years, to the point where it’s part of the usual game-buying process. A trip to any store to purchase a new release, especially a specialized store like GameStop, pretty much guarantees a pitch to pick up a “season pass” while you’re there. In short, a season pass means you pre-purchase all of the game’s future DLC up front and save a few bucks in the process.
Normally, this would be a good thing; no one complains about saving money. But gaming has the uncanny ability to warp good things into bad ones. For starters, there’s no standardized price for a season pass – one game may charge $20 for theirs, while another charges $50. Despite the wide variation in price, you don’t always know what you’ll be getting for that additional money. Season passes have been advertised and sold before the developers even revealed what the DLC pack would include.
There’s also the risk that the DLC won’t be worth the price the developers ask. If a $30 season pass contains only a few skin packs, would you consider that a sound purchase? What if that $30 only netted you about four hours of expanded content – would that be worth it? What makes this worse is the possibility that you paid for a season pass without knowing what you were getting. As far as you and every other buyer are concerned, you’re getting content worth at least what you paid, but that may not always be the case.
I understand where the concept of season passes came from, though. In the 30+ years that home gaming has existed, the cost of making a game has skyrocketed while retail prices have remained virtually the same. A game in the 1980s could be made in a fraction of the time – and at a fraction of the cost – of a game today. Technological advances have put the time and cost of making a current-generation game on par with the production of a “low budget” film. Developers need to recoup those costs somewhere, and season passes seem to be the go-to method. That doesn’t mean the practice benefits every party, though. In fact, it creates a second issue.
Games Aren’t “Complete” Upon Release
Continuing on from what I said about season passes, games released now generally feel unfinished. To be clear, that doesn’t always mean a game requires a season pass to unlock a large portion of what should have shipped in the initial release – though this happens more than it should. More often than not, games now receive day-one patches meant to unlock time-sensitive features and fix bugs that weren’t squashed before the game went gold. While the internet is a great way to patch games, resolve issues, and add features, it has become a crutch that developers rely on.
When buying a game on day one, there should be an expectation that it won’t contain any serious bugs or issues. The buyer expects it to work as advertised. Yet so often a game ships with a game-breaking bug that isn’t patched on day one. On consoles, as bad as things can get, they rarely reach a code red. Playing on PC yields different results.
PC gamers are proud of their setups, their machines, and the power they pack. For as many console gamers as there are in the world, there’s an equally large number of PC gamers (if not more). Somehow, PC games almost always wind up the most screwed up. Yes, there are times when bugs slip through the cracks, such as the bug in Prey that caused users to lose their save data in rare cases.
Other times, a game launches on day one barely able to run on a PC at all. The example that comes to mind is Batman: Arkham Knight, released for the Xbox One, PlayStation 4, and PC. While the console versions of the game were fine, the PC version was a complete disaster even on high-end machines. Users reported a myriad of problems, including slowdowns, crashes, and severe performance drops, and the developers acknowledged it as a problem they needed to fix.
In anticipation of what turned out to be a monumental event, PC gamers were offered refunds, and the game was removed from Steam in June of 2015. In October of 2015 – four months later – the game returned to Steam… with a whole new batch of issues. Another round of refunds was extended to unhappy gamers, and more patches followed in the subsequent months. From what I can tell, it took almost a year after the game’s initial release for the bulk of the PC performance issues to finally be fixed. The macOS and Linux ports were cancelled, which was probably for the best.
There was a time, many years ago, when a game in that kind of shape would have met one of two fates: it would have been released and promptly destroyed by critics and players (à la Bubsy 3D), or it would have been pushed back or canceled altogether. In 2017, with the cost of making a game, a studio can’t afford to let a game fade into oblivion, so they release an inferior product and hope that any bugs can be quickly squashed.
Another highly notable example of an unfinished game sent out to the world is Street Fighter V. The game launched in February of 2016 with versus and online multiplayer functionality, but it lacked any real campaign mode. The cinematic story mode wouldn’t arrive until June of 2016, much to the dismay of fans. Some people don’t enjoy online multiplayer and prefer to play through a game’s story; those people had to wait four months after release to play the part of Street Fighter V that they enjoyed.
Then you have other blunder-filled releases like the anti-climactic No Man’s Sky, a game hyped up by Sony that fell flat when it finally launched as a hollow shell of what was promised. A huge portion of the advertised features were simply missing. The game was released on Steam just over a year ago, yet it holds a ‘Mostly Negative’ rating across more than 75,000 reviews. The developers have since put out patches that improved the game, but they still haven’t brought it to the level initially advertised.
I could understand if something like this happened from time to time, but it’s becoming a trend. And that is disgusting.
Microtransactions Belong in Mobile Games
Let’s face it: this one has nothing to do with the rising cost of making games. It’s a glowing neon sign advertising the greed of some developers. When I saw that Mortal Kombat X would feature Easy Fatalities purchasable from an in-game market, I got a bad feeling in the pit of my stomach. Something was amiss. A few years on, it turns out that feeling was well founded: most games are now starting to feature loot box purchases and premium currencies all too similar to freemium mobile games, and it’s becoming all too common across the industry.
There has to be a line drawn somewhere with features like this. The last thing I expect to do after spending almost $100 on a game for the ‘complete experience’ is spend more money to unlock additional skins and loot boxes. Gaming is a business like everything else, but it’s getting to the point where independent developers and retro games are gaining a strong foothold in the market simply by not having microtransactions. Indie devs can’t afford to nickel-and-dime their customers the way a studio like EA or NetherRealm can. They know they need their consumer base to stay in business, and they have some level of respect for it.
As for the return of retro games, that can be attributed to the greatness of the games themselves, combined with the fact that retro games are, for all intents and purposes, plug and play. You put the game in, turn it on, and play. There’s no online connectivity, no hidden fees, no nonsense whatsoever. And I love that. Modern developers could learn a thing or two from the games of yesteryear.
Like I said before, gaming is a business, but it needs to be put in check. Game developers have lost respect for their customers, and their customers are extremely frustrated. Instead of treating gamers as a sort of family, they view us as bags of money waiting to be cashed in at the bank. The industry needs to find a way to straighten its path, especially if major studios want to stay on top. If studios like EA don’t cut the crap and give gamers what they want, the indie devs will eventually come out on top.
What do you think of the state of gaming in 2017? Add to the discussion below and let us know.