In the past few weeks, complaints about microtransactions in AAA games have reached fever pitch, with anger directed towards EA’s Star Wars Battlefront II, Monolith’s Middle-earth: Shadow of War and, more recently, Activision’s Call of Duty: WWII for their inclusion of paid loot boxes.
Criticism has also been levelled at online games such as Overwatch and The Elder Scrolls Online for including loot boxes, but in truth, these arguments are misplaced. While loot boxes in these games have somewhat normalised the practice and acclimatised video game audiences to it, they represent a necessity for developers Blizzard and ZeniMax.
The fact is that online games require constant upkeep and monitoring, particularly given the prevalence of griefers and cheaters, and developers need monetary support to ensure a happy and safe online community. As Rich Lambert of ZeniMax stated in one of our earlier interviews, following the release of The Elder Scrolls Online: Morrowind, “we have to keep the lights on. It’s not cheap to keep the game up and running.”
Rather than focus on these complaints, we’d instead like to talk about newly implemented ‘pay-to-win’ loot box systems, wherein players who pay extra are rewarded with faster game progression. This system has rankled many gamers, as it gives an unfair advantage to those with the capacity to spend more on games. While it would be easy to look at this problem in isolation (microtransactions, by themselves, aren’t such a bad thing), the expansion of loot boxes is part of a growing trend within the industry.
The inclusion of loot boxes in games such as Shadow of War and even NetherRealm’s Injustice 2, both of which offer only peripheral online modes, has raised some eyebrows. Unlike Overwatch or The Elder Scrolls Online, which require constant updates and active development, these games include paid microtransactions as a seemingly unnecessary addendum to their already hefty initial cost.
Backlash against microtransactions has been loud and aggressive, with Star Wars Battlefront II publisher EA forced to adapt its loot box system amid complaints that the game had become a pay-to-win title. While it’s clear that microtransaction systems are yet to be accepted by the casual gaming audience, without fundamental change in the industry they may become a necessity.
Many have suggested that the advent of gaming loot boxes is driven by corporate greed; however, this interpretation of the issue reveals an ignorance of the current video game landscape. The rising cost of games development has been charted frequently, and those accounts detail the increasing complexity of modern video games and the need for larger teams of highly skilled workers.
The fact is that modern video games cost hundreds of millions of dollars to make, even as high as US$265 million, if reports on Grand Theft Auto V’s development are accurate. Dedicated developers put years into producing the games that we enjoy, and despite the rapid development of gaming technology, price points for AAA games have yet to reflect the rising needs of the games industry.
Games cost the same as they always have, but with the obvious and dire need for structural change in the industry, is it time to reconsider their price points? In 1996, Australian gamers bought the classic Super Mario 64 from Toys “R” Us at a $59.99 price point. In 2017, gamers purchased the frankly spectacular Super Mario Odyssey for $62.00 at Big W.
Remarkably, when adjusting for inflation, savvy gamers bought the newest Mario title for significantly less than the cost of Super Mario 64, despite the obvious and incredible advancements between the two games. Not only that, but the development team for Super Mario 64 numbered between 15 and 25 people, while Super Mario Odyssey was developed by the same 90-strong team behind Super Mario 3D World; it’s safe to say the modern team was much larger and required far more investment than its 1996 counterpart.
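As a rough sanity check on that claim, here’s a minimal back-of-the-envelope sketch. It assumes Australian inflation averaged around 2.5% a year between 1996 and 2017 (our assumption; the actual CPI path differs slightly, but the conclusion holds either way):

```python
# Back-of-the-envelope inflation check (illustrative only).
# Assumes Australian inflation averaged ~2.5% p.a. from 1996 to 2017;
# the real CPI path differs slightly, but the order of magnitude holds.

price_1996 = 59.99   # Super Mario 64 at Toys "R" Us, 1996 (AUD)
price_2017 = 62.00   # Super Mario Odyssey at Big W, 2017 (AUD)
annual_inflation = 0.025
years = 2017 - 1996  # 21 years

adjusted_1996 = price_1996 * (1 + annual_inflation) ** years
print(f"Super Mario 64 in 2017 dollars: ${adjusted_1996:.2f}")               # ~$100.76
print(f"Real-terms saving on Odyssey:   ${adjusted_1996 - price_2017:.2f}")  # ~$38.76
```

In other words, a $59.99 game from 1996 would cost roughly $100 in 2017 dollars, making Odyssey’s $62.00 sticker price a real-terms discount of nearly 40 per cent.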
The implementation of microtransactions within AAA video games can be traced to this fundamental problem: consumers are still paying the same amount for video games that they were 20 years ago. Video games are expensive and complex to make, and at an average retail price of $79.95, retailers like Target and Big W are selling at cost, or at a loss.
EB Games, in comparison, sells its titles at around the $99.95 mark, and given that its business is currently prospering, it’s safe to say that its audience is willing and able to pay more for games. This raises the question: should developers be charging more for their products?
Having said that, Australian gamers, on average, are still paying far more for their games than their US counterparts, largely due to poor foreign exchange rates, parallel import restrictions and a lack of competition among retailers. If you were looking to purchase the recently released Sonic Forces in the United States, you’d expect to pay US$39.99 from GameStop, while in Australia, you’d be paying $59.95. It should be noted that GameStop and EB Games are the same company. It would be fair to say that Australian consumers are already paying more than enough for their video games; even so, it’s not enough to support the industry.
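How much of that gap is really down to the exchange rate? A quick sketch, assuming a rate of roughly US$0.76 to the Australian dollar (approximately the late-2017 figure; actual rates fluctuate daily), suggests the premium goes beyond currency alone:

```python
# Rough currency comparison (illustrative only).
# Assumes ~US$0.76 per AU$1.00, roughly the late-2017 average; rates vary.

us_price_usd = 39.99    # Sonic Forces at GameStop (US)
au_price_aud = 59.95    # Sonic Forces at EB Games (Australia)
usd_per_aud = 0.76

us_price_in_aud = us_price_usd / usd_per_aud
premium = au_price_aud - us_price_in_aud
print(f"US price converted to AUD: ${us_price_in_aud:.2f}")   # ~$52.62
print(f"Australian premium:        ${premium:.2f}")           # ~$7.33
```

Even after conversion, Australians pay a premium of around $7 per copy on this title.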
Any significant change to Australian pricing would be only a drop in the ocean of change needed to support the games industry, but it would initiate a much-needed conversation. Australia is often seen as a ‘guinea pig’ for testing within the games industry, as we represent a smaller portion of the market and can therefore act as a rudimentary case study for any needed changes. Implementing a new pricing system here would allow for the collection of much-needed data, and an understanding of consumers’ willingness to pay. These changes, however, should not be limited to Australia. There is a fundamental need for structural, industry-wide change, and Australia must play a part in it.
For the average consumer, a price hike for games may represent a mere inconvenience, a few more dollars spent when purchasing the newest FIFA or Call of Duty title; for passionate and enthusiastic gamers, however, it represents a more significant change. Games, as we mentioned earlier, are expensive, but the cost is not unjustifiable. When considering the monumental skill and wide range of people involved in the creation of video games, their current price barely reflects that effort. While pricing changes would mean a significant drop in the purchasing power of consumers, they would also mean an increase in the quality of games, the stability of gaming jobs across the world, and the vast potential for growth within the industry.
While gaming audiences might baulk at the idea of paying more for their games, the simple fact is that the current video games industry is unsustainable, and it’s unreasonable to expect it to carry on as it always has. Microtransactions merely paper over a problem that has plagued the modern games industry, and as a consequence, they’ve drawn accusations of gambling and greed against developers who are only trying to keep their teams afloat.
In the current industry landscape, it’s becoming increasingly important to support the developers whose games you enjoy, whether they’re smaller indie studios or AAA teams. To ensure the quality and prosperity of the games industry, it’s time to rethink our approach to video games, their developers, and how much we truly value them.
How do you feel about microtransactions in gaming? Would you be willing to pay more for your games? Join the conversation on Twitter, or on Facebook.