Video games may have been a niche hobby many moons ago, but advances in tech and the drive to commercialise entertainment products have launched ‘gaming’ into the mainstream.
Total worldwide revenue of the video games industry is expected to be around $580 bn in 2030. Contrast that with the worldwide cinema box-office revenue of just $21 bn in 2021 and it is clear that gaming is now a giant in multimedia consumption.
The single highest-grossing entertainment product of all time is Grand Theft Auto V, surpassing $7 bn in revenue, while the solo developer of the infamous mobile game ‘Flappy Bird’ was banking around $50,000 a day in ad revenue at its peak in 2014; the game apparently took only two days to code.
With such a large financial and cultural footprint, regulation (inevitably) creeps in. In March this year, the Austrian District Court of Hermagor (a local first-instance court) ordered Sony to refund a claimant who had spent around 300 euros on FIFA Ultimate Team packs. Players can either use in-game currency, earned by winning matches or completing challenges, to ‘unlock’ these packs, or purchase them outright with their own money. The judgment held that this constituted a game of chance within the meaning of the Austrian Gambling Act, especially since there was a pecuniary benefit to Sony.
FIFA Ultimate Team has been around since 2010, and recent regulatory tightening has forced EA (the developer of FIFA) to display the percentage odds of obtaining a high-rated football player before a user purchases a pack. This could be seen as an admission of liability, pending a larger decision by the European courts. Indeed, there have been growing concerns over the impact of gambling-like features within video games on vulnerable users. FIFA is a PEGI 3 rated game, meaning it is deemed suitable for all age groups.
So how exactly did the gaming industry transform into a potential gambling one? Perhaps the starting point should be to look at the history of video game publishing.
Traditionally, games were released to the public as an outright purchase; you bought the disc, and you owned the product forever (unless of course the disc was re-sold on a third-party marketplace). This then evolved into the provision of downloadable content (or “DLC”), where gamers could purchase additional content via an app store or game store and install it onto their device. The user could only run the new content once they had inserted the disc into their console or installed the full digital version of the game on their device.
More recently, however, developers have been publishing their games as free to play (or “F2P”), recouping revenue from discretionary purchases through an in-game store. Users can purchase virtual items from the store, such as making their playable character look like Lionel Messi in the popular F2P game Call of Duty: Warzone. Often labelled the ‘live service’ model, users can download the game entirely for free, sink unlimited hours into it, and enjoy regular or seasonal updates where new content is added. Purchases from in-game stores are often dubbed ‘microtransactions’ (or “MTX”) since they are considered secondary to the main experience of the game.
Instead of software-as-a-service (or “SaaS”), this is ‘games-as-a-service’, and popular games of this kind include Fortnite and Counter-Strike. It is also not uncommon for games originally released as one-off purchases to turn into F2P games (for example, The Sims 4), no doubt attracted by the allure of the MTX model.
So, as with many industries, a Schumpeter-esque creative destruction is in play, with developers required to get with the times or risk destruction. It’s the perfect justification – you get a free game, but we monetise the product as much as possible to earn a profit. This seems reasonably justifiable, but it is open to abuse.
Counter-Strike: Global Offensive is a popular game which consistently has over 1 million concurrent players at any time, and has a prominent in-game marketplace and ‘lootbox’ function. Players can purchase a virtual key to unlock a virtual crate; a scrolling graphic then appears (akin to a gambling wheel), eventually stopping on the virtual item won, and the resulting win can be lucrative. At the time of writing, the most expensive item available to buy on Steam (Valve’s equivalent of the Sony PlayStation store from our FIFA example) is a knife for Counter-Strike worth $2,003.30.
A cohort of online influencers have found a niche – livestreaming themselves opening lootboxes to audiences of thousands of viewers, capitalising on the rush of potentially finding a rare virtual item. These ‘streamers’ receive thousands of pounds’ worth of viewer donations daily, despite not actually playing the game. Such streams are often broadcast on ‘Twitch’ or ‘Kick’ – the latter allegedly owned by the founders of an online cryptocurrency casino, though the former has taken steps to restrict gambling streams on its site. The vast majority of viewers on such platforms are minors and young adults.
From a practical perspective, most games use an intermediate currency to purchase in-game items rather than real money. Using the Counter-Strike example above, I would have to top up my ‘Steam Wallet’ with my own money to purchase a potentially indeterminate number of virtual keys to unlock lootboxes before I eventually win the virtual knife (or top up my Wallet enough to buy the knife outright). There is currently no way to ‘earn’ such items in Counter-Strike by playing the game, as with many ‘cosmetic only’ MTX models in live service games. Once funds are added to your Steam Wallet, there is no way to withdraw them or convert them back to GBP.
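To see why an ‘indeterminate number of keys’ matters, a back-of-the-envelope calculation helps. If each crate independently yields a rare item with some fixed probability, the number of crates opened before the first win follows a geometric distribution, so the expected spend is the key price divided by the drop rate. The odds and price below are made up for illustration – real drop rates are set by the developer and are not published in this case:

```python
# Hypothetical figures only: neither the drop rate nor the key price
# is a real Counter-Strike number.

def expected_cost(drop_rate: float, key_price: float) -> float:
    """Expected total spend to win one item that drops with
    probability `drop_rate` per crate (geometric distribution:
    expected number of opens is 1 / drop_rate)."""
    return key_price / drop_rate

# Suppose a knife drops 1 time in 400 crates and a key costs $2.50.
print(f"Expected spend: ${expected_cost(1 / 400, 2.50):,.2f}")
# Expected spend: $1,000.00
```

The expectation is only an average: any individual player may spend far more before winning, which is precisely the open-ended exposure that distinguishes this model from a fixed-price purchase.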
Under UK law, the Gambling Act 2005 defines gambling as a type of ‘gaming’, which means playing a game of chance for ‘money or money’s worth’. Undertaking such activities without a licence is a criminal offence, and only persons over the age of 18 can participate. The UK Gambling Commission has expressed doubt that winning virtual items constitutes ‘money or money’s worth’. However, third-party marketplaces do exist, and it is not too difficult to sell a password to a bona fide bidder over the internet for an account containing valuable in-game items. After all, the Gambling Commission caveated its comments on virtual items by acknowledging this could be tested in the courts (the Austrians seem to have taken the first step). If the UK courts deem the MTX model gambling, we could see games aimed at minors stripped of their lootbox functions and criminal sanctions imposed.
The UK government has promised a ‘major reform of gambling laws’ to protect vulnerable users from the ease of access to smartphone applications with gambling-like functions. Proposed changes include a review of online game design rules to limit the speed at which users can complete gambling actions, as well as looking at ‘other characteristics’ which exacerbate the risk of addiction. However, the government’s gambling proposals lack measures aimed at consumers of video games. Nonetheless, there is a good chance of this changing soon, when we look to the current rules and future trajectory of the regimes of online safety, consumer law, and data protection.
The GDPR has provisions (under Article 22) protecting individuals where decisions are made about them by wholly automated means and those decisions have a significant impact on the individual. This often arises where organisations use artificial intelligence and algorithms to infer data about a person, or create a profile about an individual, to make a decision on, for instance, their mortgage offer. Although spinning a virtual wheel within a video game may not involve personal data (you would have to check the privacy policy for that), the fact that a random algorithm decides what prize you get, and the fact that there is a real danger of addiction for vulnerable individuals, means we could see similar principles applied to video games. Organisations which use automated decision-making already have a duty to provide individuals with a ‘meaningful explanation of the logic involved’ in such processes – not too dissimilar to providing the odds of finding a high-rated player in those FIFA packs.
Echoing the recommendations of its European counterparts, the UK Office of Fair Trading stated that video games advertised as free should not mislead the consumer as to the ‘true costs’ involved in owning and playing the game, and that children should not be ‘exhorted’ to purchase, or ask their parents to purchase, virtual items from in-game stores. This could potentially extend to a mandatory disclosure of gambling-like MTX functions within video games and a ban on such products aimed at children. After all, UK consumer law (via the Consumer Contracts Regulations 2013) does oblige vendors to provide customers with suitable pre-sale information, such as the main characteristics of the product being bought.
There has been an evident shift toward protecting vulnerable users of online products, strengthened by the proposals within the Digital Markets, Competition and Consumer Bill currently ploughing through Parliament. That bill, amongst other things, introduces certain trust and transparency objectives for the benefit of users of digital products. More specifically, vendors of digital products are prevented from requiring or incentivising users to use other products alongside the main provision of content. This could, for instance, prevent video game developers from requiring a user to purchase lootboxes as a way to gain advantage over their gaming peers.
Running alongside the Digital Markets bill is the Online Safety Bill, which aims to protect children from harmful online content. Although focused on user-generated content hosted by, for instance, social media platforms, the Bill will introduce measures which could be transplanted into future legislation to restrict gambling-like functions within video games. For instance, one of the proposals is mandatory ID and verification checks for users of online pornography sites to prevent minors from accessing illicit content. ID verification for video games containing lootbox functions could be on the cards.
China, which accounts for nearly 50% of the global market share for mobile gaming, introduced a curfew for individuals under the age of 18 preventing them from playing video games between 10pm and 8am. This was in addition to a ‘unified identification system’ through which video game developers and platforms can verify the age of their users. Though China’s regime is not directly comparable to the UK’s, it shows that major legislative systems are identifying and suppressing video games containing exploitative and addictive mechanics.
The UK Parliament recently concluded that children under the age of 14 spend an average of around three hours a day on screens, no doubt exacerbated by the recent COVID pandemic. Accessibility and exposure to potentially harmful content have increased exponentially in a more connected world, especially with the extinguishing of what sociologists call the ‘third place’. On the examples above, therefore, anything that has an adverse effect on a vulnerable individual is likely to be caught by regulations (present or future) governing the use of online products and services. With the advent of a more regulated internet, it is likely we’ll see video games caught by future legislative regimes combating ‘online harms’.
Nabiel is a trainee solicitor currently working in commercial law and advises digital businesses on their business-to-business and business-to-consumer matters. He has a keen interest in the video games industry and keeps up to date with the latest regulations affecting the tech industry.