There is no question that the video game industry generates more worldwide revenue than any other entertainment industry, outmatching cinema, music, books, et cetera, and that in recent years the industry has been adopting practices to maintain and increase its yearly revenues, all while pursuing new avenues for additional monetization. Video games, like really any product, are directly tied to money and the perceived value they provide in exchange for that sum, yet within the past decade, I cannot help but feel as if something is… off about the balance between the money spent on games and the value garnered from them. But before getting to that, I’m in the mood for a bit of an informal history lesson.
Part I: An Informal History Lesson
When they were first introduced, video games were predominantly featured in arcades, localized game centers designed to entice eager players, namely children and teenagers, with a variety of games offering some form of spectacle and requiring some form of skill, whether it be honing one’s reaction time in something like whack-a-mole, their aim with skee-ball, or their ability to time shots to hit gradually descending space aliens that slowly but surely accelerated as more and more of them were exterminated. They were attractions that offered players an opportunity to entertain and test themselves in exchange for a relatively small amount of money, often just a coin or two.
These arcades proved to be quite popular given the novelty of the burgeoning industry of the time, and while the games themselves were relatively simple, with the bulk of them looping over and over with minor modifiers such as increased difficulty, there was a palpable sense of value in playing these games and, eventually, getting very good at them. The structure of these games relied on the player putting value in their achievements, most often represented by a score, and that’s exactly what happened, with many people priding themselves on getting progressively better at titles as time went on. However, getting good at these games would cost a lot of money spread out over an extended period of time, and it was that form of repeated business that many arcades thrived on, which in turn influenced the design of these games.
For as much as a certain generation loves arcade games and looks back on them with a sense of adoration, they were ultimately designed as businesses, and every game featured in an arcade was there because the owners believed it would allow the arcade to generate more revenue. This resulted in a lot of games that drew players in with a degree of spectacle, often going above what could be done on home consoles, which I’ll get to in a minute, but required the players to either express an incredible amount of skill or invest a hefty sum of coins to see the title through to the end. Each transaction was relatively small, but 50 cents here, a dollar there, and a person could easily spend $10 in a given afternoon at an arcade with nothing really gained from it beyond the thrills of then-contemporary arcade gaming.
If my tone is not enough of an indication, I personally find the value of a lot of arcade gaming to be very low in comparison to the amount of money it often requires. But around the same time arcades were boasting video games as a main attraction, home consoles, personal computers, and dedicated handheld gaming devices were hitting the market, offering eager players games they could play indefinitely and hardware they could own. This should have caused arcades to suffer, but they still offered plenty that home gaming hardware lacked, namely spectacle and accessibility, as they did not require people to invest hundreds of dollars into a system, accessories, and, of course, games. Now, this is where things get messy, as there were really three industry subsets introduced at this time, and all of them had wildly different pricing models: home consoles, handhelds, and home computers.
Handhelds actually followed the simplest model, with hardware typically priced somewhere between $70 and $250 back when dedicated handhelds were still a thing, and games that ranged anywhere from $20 to $40 at launch, gradually moving towards the higher end of the spectrum over the 30 years they were relevant, with prices often fluctuating based on the level of pedigree or content each title boasted.
Home computers are probably the most complicated, as buying games back then often required placing mail orders with distributors or buying lavishly arranged packages that made buying a PC game feel like a major event, which added quite a bit on top of the somewhat standardized launch prices of $30 to $70, fluctuating on a game-by-game basis. Yet even after buying a game, there came the issue of actually playing it on one’s several hundred or thousand dollar hardware, as this was before PC hardware standardized and entered the mainstream. Though that itself is simplifying things a bit too much, as it was common to see budget games release for about $10 back in the heyday of cassette-based gaming. Which sounds great, until one realizes how prone cassettes were to technical defects. Same with floppy disks.
Home consoles can themselves be segmented into roughly three eras when it comes to pricing: the cartridge era, the disc era, and the modern digital era. None of these are properly or rigidly designated throughout history, as this divide is more my own internalization of the information I casually parsed to form this article. Though all of them follow a very similar approach when it comes to the pricing of the consoles themselves, which have occupied the fairly consistent price range of $200 to $400 at launch, excluding systems that either were blatantly unsuccessful or quickly incurred a price drop.
What I dub the cartridge era includes every major console that used cartridges as its primary form of media, painting a timeline spanning from the Atari 2600 to the Nintendo 64. In this era, games cost somewhere between $30 and $80 at launch, gradually rising as cartridge technology became more sophisticated, with prices based on the technology behind each game. That is, games that required additional hardware in the form of expanded storage space, battery backups, or unique chips were more expensive than comparatively simpler games. While it may seem like games grew more expensive over the era’s run, such changes can largely be attributed to inflation.
Meanwhile, the disc era includes every major console that used optical discs, barring oddities like the 3DO or Sega CD, and can be considered to have spanned from the Playstation to a debatable end sometime before or during the generation of the Xbox 360, Playstation 3, and Wii. This change in medium brought forth a far more unified price model where just about every major release cost $50, as manufacturing an optical disc and plastic case is far less expensive than soldering a circuit board into a piece of thick plastic that is then shoved into a cardboard or plastic box. This remained the standard price until the rise of HD systems, which raised the standard to $60 due to increased development costs and general inflation.
Now, in both of these eras, game prices tended to remain fairly evergreen. Games were rarely given significant discounts by stores due to the costs they had to pay to acquire them, and while prices did steadily decrease as the years went on, it was a slow and steady process in most cases. Or at least until games started disappearing off store shelves. Once that happened, the price was determined by how many people were willing to sell their own copies, or how badly a given retailer wanted to get them out of the bargain bin.
So, I talked about price, but what about value? Well, the sense of value associated with games of these eras primarily came from their campaigns, how much players got out of them, and the time players were able to invest in them, because if a game could be cleared and done with in a weekend, then many saw it as only being worthy of a rental. This encouraged game developers to design games that kept players entertained for a longer period of time, whereas playtime in the arcade was largely dictated by how many coins someone managed to stuff in their pocket.
They achieved this through many methods: making games more difficult or cryptic, thus delaying a player’s progress in campaign-driven experiences until they got good enough or bashed their head against a wall enough to find an elusive solution; recycling content to elongate experiences via padding; including unlockable extras and incentives for performing specific actions; funneling more time and money into development to make the game longer; or just making the game so gosh danged fun that the player is satisfied by the experience as is.
But to simplify things into a single statement, the value of a game comes from the sense of enjoyment one personally receives by playing it and experiencing what the game has to offer. Naturally, value is subjective, and whether or not a game is worth its stated price stems from how an individual would quantify the value of their experience with the game into a financial metric. A game is a good deal if the value exceeds the price, a bad deal if the price exceeds the value, and so forth. With regards to games that are single purchase transactions, that concept is pretty straightforward, and the matter of assigning value to games was straightforward as well. Yet as the digital era drew near, things began to change.
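The value-versus-price comparison above can be sketched as a tiny function. This is a toy model, not a real formula; the dollar figures in the usage lines are hypothetical:

```python
def deal_quality(subjective_value: float, price: float) -> str:
    """Compare what a game is worth to you against what it costs."""
    if subjective_value > price:
        return "good deal"
    if subjective_value < price:
        return "bad deal"
    return "break-even"

# Hypothetical figures: a $60 game I expect to enjoy to the tune of $80...
print(deal_quality(80, 60))  # good deal
# ...versus a $60 game whose experience I would only value at $25.
print(deal_quality(25, 60))  # bad deal
```

The hard part, of course, is the `subjective_value` input: everything in the sections that follow is about how that number gets inflated, obscured, or anchored by pricing practices.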
Part II: The Digital Modern Reality
In the modern era, games often undergo a form of hyperdeflation as time passes, not necessarily with regards to their MSRPs, though retail prices did begin dropping faster as well, but with regards to temporary sales which dramatically discount games. In 2010 I first started taking note of how certain major AAA multi-platform releases were occasionally marked down to half of their launch price less than a quarter later, and that trend continues to this day, with rapid depreciation becoming something of a running gag amongst certain members of the gaming community. There are naturally still exceptions, with major Nintendo titles remaining evergreen with regards to both sales and price. However, the more time passes, the more reliable this trend becomes.
Theory 1 – Digital Competition: While digital game purchases and non-physical game distribution were around as early as the cartridge era, via services like the Sega Channel, the Super Famicom Satellaview, or BBSs if you want to consider those digital distribution, game consoles did not come with dedicated digital storefronts until the mid 2000s. These storefronts, Xbox Live, Playstation Network, Steam, and the (recently closed) Wii Shop Channel, allowed players to purchase smaller scale titles that simply lacked the capacity to succeed as full physical retail releases, along with re-released classic games for $5 to $10. This subjected $50 and $60 retail titles to more competition, as for the same price as a new game, players could easily buy a customized collection of smaller titles that, to them, could represent significantly more value, thus necessitating lower prices for retail titles to remain competitive.
Theory 2 – Laggard Rush: Game publishers, games media, and social media have cultivated a culture meant to instill a sense of FOMO (fear of missing out) within the gaming enthusiast audience, incentivising pre-orders and launch purchases so players can experience that newfangled hotness that everybody is talking about. This in turn has resulted in most modern games making the bulk of their money at launch, with a game’s success often being determined by launch week, month, or quarter sales, as sales tend to fall off beyond that point.
Once this early adopter audience has picked up the game, sales can be rejuvenated by a sudden sharp discount that aims at the reservation price of a certain laggard audience who did not feel the game was fully worth its standard price. While this could technically be done years afterwards, games also need to compete for public mindshare, so it can be better to have major sales sooner rather than later. Which itself is just another side effect of the great almighty FOMO.
Theory 3 – The Rise of Digital Distribution: Game companies in general have been pushing for digital distribution for quite a while, as it is simply more profitable for them. It is an environment where they set the price, they control the message, and they receive 70% or more of the sales price as revenue, without needing to worry about any of the expenses associated with producing physical copies of games. This in turn lowers the number of physical copies that need to be distributed along with the associated costs of production, guarantees that the player will always be able to buy a game so long as the licensing agreement does not fall through, and provides a place for games to be advertised through sales banners and lists showing new or popular titles.
Theory 4 – Steam Set the Precedent: Back when PC gaming was on the rise during the late 2000s and early 2010s, Steam sales were held with a degree of reverence, as they featured such tremendous discounts that people were spurred to buy games by the bushel, as they were so gosh darn cheap that it seemed foolish not to pick them up. This in turn made many publishers oodles of revenue with no significant expenses, was emulated by just about every other digital storefront, and became something of a meme to boot. Also, fun fact, contrary to what some may think, heavy discounts apparently do not have much of an effect on non-discounted post-release sales. So, yeah, there really is no downside associated with this practice.
Theory 5 – Software Sales are Soaring: The rise in digital dependency seen throughout the 21st century has inundated people with constant information about things to consume, purchase, and engage with. It has resulted in many things, such as the aforementioned rise in FOMO, but speaking broadly about game software sales, they have never been better. The core audience has enough buying power to purchase games regularly, and as such sales in general have been on the rise for a while, with little indication of stopping (at least until the expected generational dip). The market is simply in a good place at the moment, shovelware and general fluff aside, and that helps support this growing trend.
As for the next theory… I think it deserves its own section.
Part III: The Rise of Additional Monetization Methods
The idea of expanding games beyond their initial release is not a new concept, as expansion packs and updates were somewhat common throughout PC gaming during the 80s and 90s. They built upon what the base game offered, adding new areas, mechanics, and story to cater to an audience that enjoyed the base game while recycling technology, and incentivising players with an asking price below that of a new game, often around $20 or so.
In the early 2000s, however, that mentality was heavily expanded upon with the rise of digital storefronts that allowed smaller bits of tertiary game content to be distributed via the internet, with no real associated costs beyond the cut taken by the platform holders who operate these storefronts. This provided companies with incentives to create small and minor bits of downloadable content, such as bonus items, extra costumes, or small side quests, in addition to the occasional expansion-sized addition. While this was generally looked upon as a positive, certain publishers had questionable practices, such as allegedly cutting part of the game before release to be sold as DLC, or simply overpricing cosmetic items, something that a lot of fighting games were, and are, notoriously bad at. This in turn led to the current microtransaction and Live Service driven industry that represents a large fraction of modern AAA gaming.
Getting back to the main topic at hand, the prices attached to DLC and the content offered have routinely struck me as baffling throughout the past several years. Unlike game prices, additional monetization prices tend to be stable to a fault, as it is exceedingly uncommon to see a season pass or DLC price drop, and the discount applied to peripheral content during sales is typically a degree lower than the discount afforded to the base game. Though this issue is largely negated by the existence of complete, game of the year, or ultimate editions that contain the base game with all DLC.
I believe the reason that DLC that was $10 a decade ago is the same price today has to do with a limit to the hyperdeflation I talked about earlier. While prices continuously trend towards the bottom, most games settle on a final MSRP of somewhere between $15 and $20, as prices that low are considered reasonable by a majority of consumers. The same holds true for smaller digital titles, as just about every game that launched for $10, $15, or $20 over the years has kept the same base price, only ever dropping lower in the form of a temporary sale.
This winds up creating a very… skewed sense of value when judging full games in comparison to their additional content. To give a personal example, back in 2011 I purchased a physical two-game package containing Saints Row and Saints Row 2 for $20. This was not a sale item or anything, but rather a new SKU made prior to the release of Saints Row: The Third to help people keen on the latest game catch up on the series thus far. It’s actually a fairly common practice, and considering how Saints Row 2 normally went for about $20 around then, it was a pretty good deal. Anyways, after going through Saints Row 2 and greatly enjoying the title, I grew curious and checked out the additional content offered for the game, which boasted two significant pieces of DLC that added new missions, outfits, and story content that could be generously described as an extra 20% of the game’s content, for the cumulative price of $17. A price that might be reasonable for a $60 title, but was about as much as I paid for this officially distributed two-pack.
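Laying out the arithmetic from that example makes the skew plain. The sketch below is a back-of-the-envelope calculation using only the figures I quoted, with "content" measured in loose "games' worth" units:

```python
# Figures from my Saints Row 2 example: the two-game pack cost $20,
# while the DLC added a generously estimated extra 20% of content for $17.
pack_price = 20.0            # Saints Row + Saints Row 2 two-pack
dlc_price = 17.0             # cumulative price of both DLC packs
dlc_content_fraction = 0.20  # DLC as a fraction of one game's content

# Treat the two-pack as 2.0 "games' worth" of content.
pack_content = 2.0
content_per_dollar_pack = pack_content / pack_price          # 0.100
content_per_dollar_dlc = dlc_content_fraction / dlc_price    # ~0.012

# The pack delivers about 8.5x more content per dollar than the DLC.
print(content_per_dollar_pack / content_per_dollar_dlc)  # 8.5
```

Obviously "an extra 20% of the game" is a fuzzy, subjective estimate, but even doubling it in the DLC's favor leaves a fourfold gap in value per dollar.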
There are many more extreme examples of this dichotomy; just look at most of Koei Tecmo’s library, which has some of the most exorbitant DLC prices I have ever seen, for content that, relative to the game, should only cost a fraction of what it does. Now, I know the reasoning behind this is that customers do not think too much about spending $10 on something, and that because not a lot of people buy a given piece of DLC, publishers must charge a premium on it to make back their development costs. Yet upon laying it all out, it seems like a bizarre system that people should identify as unfair and as offering an inconsistent amount of value per dollar. But I suppose that most people do not think too deeply about transactions involving a small or obfuscated amount of money, which is further exemplified in the form of microtransactions, gacha games, and loot boxes.
A lot of AAA game publishers this past generation have been nudging players towards multiplayer driven games under the pretense that such games keep players engaged for a longer period of time, and as such, make them more susceptible to paying more in order to elongate the value they get from a given game. This is often achieved through a number of smaller purchases made periodically, which keeps the player from thinking too deeply about the money they are spending. Or alternatively, players can exchange a large sum of money at once for greater rewards in the form of “free” in-game currency, resources, or items.
This is seen everywhere from the latest sports titles to the biggest annual shooters to the highest grossing mobile titles, and, well, I find this approach to be a little gross. It relies on people not fully thinking through the value of a transaction in comparison to the price they pay, and often encourages some form of gambler’s fallacy in order to keep players buying in pursuit of an elusive reward. It is a fallacy that many people are psychologically susceptible to, and some have been manipulated into pouring thousands of dollars into a game in order to get more value from it, or obtain something valuable within the game.
What I find to be especially odd about a lot of these situations is how the games that spur this sort of behavior are free-to-play titles, which often offer several hours of value up front, with the better ones allowing players to fully enjoy the experience with no additional monetary purchases. They are effectively giving players free content in hopes that they will eventually cave into shelling out money, either in the form of a sort of tip jar, thanking the developers by paying them a morsel of money, or by going all in, dedicating themselves to investing in a system where, best case scenario, they spend hundreds of dollars on bits of intangible data represented by a PNG of a cute girl. Even though they could have easily bought an entire library of games filled with waifus aplenty for just a fraction of that.
I briefly talked about how microtransactions simply were never “worth it” to me in Fire Emblem Heroes when I reviewed the game a year ago, and looking over several examples, I continue to think that microtransactions are almost never worth it, as the asking price for a tertiary amount of content is often outlandish in comparison to the value games provide otherwise. Yet this approach is clearly working, with many free-to-play games and premium live services generating revenue rivaling the retail sales of some of the most successful titles of the generation.
All of which has led to a peculiar dichotomy where games are cheaper than ever, but can also cost people more than ever given the amount of overpriced ancillary content so many are adorned with nowadays. Yet, I cannot really fault companies for doing this. They have found methods of making additional revenue, and as businesses, they want to scrounge up that revenue while complying with established laws… and maybe, possibly, theoretically, some form of agreed upon ethics. Meanwhile, the gaming community seems to have accepted this new monetization model, and is clearly supporting it. All of which I honestly find to be a bit… upsetting. Yes, video games were always designed as a way to generate revenue for companies, but now it seems so much more blatant due to the advent of additional monetization methods.
While it is admittedly up to consumers to decide how they spend their money and what value it represents, I cannot help but feel that the prices and true value offered by many games of this era are so often obfuscated in a way that misleads consumers. I set reservation prices for all the PC games I want, log all of my monetary transactions in a spreadsheet, and have a bachelor’s in accounting, so I am very conscious of my purchase decisions and the value of what I am paying for. A lot of people, however, lack that financial savviness, and as such, are more susceptible to paying more than they need to or really should for a given game.
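My reservation-price habit amounts to a simple watch-list check, which can be sketched as follows. The titles and prices here are hypothetical placeholders, not my actual spreadsheet:

```python
# Hypothetical wishlist mapping each game to my reservation price --
# the most I am willing to pay for it.
reservation_prices = {
    "Game A": 15.00,
    "Game B": 30.00,
    "Game C": 5.00,
}

# Hypothetical current storefront prices during a seasonal sale.
sale_prices = {
    "Game A": 12.49,
    "Game B": 39.99,
    "Game C": 4.99,
}

def games_to_buy(reservations: dict, prices: dict) -> list:
    """Return games whose sale price has hit or fallen below my reservation price."""
    return sorted(
        title for title, limit in reservations.items()
        if prices.get(title, float("inf")) <= limit
    )

print(games_to_buy(reservation_prices, sale_prices))  # ['Game A', 'Game C']
```

The point is less the code than the discipline: deciding what a game is worth to you before a sale banner tells you what it costs.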
Oh, and the value present within modern games themselves? Honestly, it is easy to look at all this additional content available for some games and assume that the base game itself is of lesser value, but judging the value of games broadly is very difficult due to how subjective value is. If a game is greatly hampered by the exclusion of near mandatory features that are only available as additional purchasable content, then the value of the base game is greatly reduced. Yet if the base game has plenty of content with a large degree of value tied to it, and there is merely additional content available to purchase, then that does not mean the base game is bereft of content by any means.
The bottom line, and the overall point I’m trying to make here, is that a lot of game companies are indeed trying to obfuscate the value of their products and confuse customers into paying more. Not out of necessity, but because it allows them to perpetuate the cycle of growth that is demanded of most publicly traded corporations in this late-capitalist society, because so many companies only really give a toss about their shareholders instead of their stakeholders, as is the American way. Nothing short of a collapse or revolution could really change this system, and all end-user consumers can really do about it is try to be aware of these tricks, keep value in perspective, and practice frugality when it comes to entertainment purchases. It kinda sucks, but that’s life for ya.
Header image comes from the anime K-On!, because that is the best image reference I had for money.