The War Between Web and Game Developers

Devs are devs, you say? Think again. There’s a fierce battle going on between web and game development over programming paradigms, best practices and fundamental approaches to work and life.

Written by Ari Joury, Ph.D.
Published on Feb. 17, 2021

If you think that web development is an enormously lucrative field, you’re not wrong. In 2020, developers working on projects ranging from simple blogging sites to complex social media platforms in the United States earned $40 billion. The video game industry is similarly flush with cash, currently enjoying a market value of $60 billion in the U.S. alone.

Despite their similar sizes, the industries couldn’t be more different. You can book competent web dev services from a fairly skilled high schooler. The skill set of a game developer, however, is far more advanced. Instead of creating a bunch of static sites (there are dynamic ones too, but they’re rarer) in two dimensions, game developers must build an extremely dynamic and responsive set of three-dimensional scenes that often need to obey the laws of physics. Sara Chipps from Stack Overflow likens the contrast to that between a neurosurgeon and a dog walker.

With this contrast in mind, you might think that game developers, equipped with enormous capabilities, run the world while lowly web developers have nothing of importance to say. Although game developers tend to have more expansive skills than web developers, they’re nowhere close to running the world — or even getting paid adequately for the hours they crunch. Web developers, on the other hand, don’t have it that bad.

The outlook on work and life couldn’t be more different in the two areas. Most web devs work in pretty stable jobs, have regular hours and receive appropriate pay for the workload. Many game devs, on the other hand, not only crunch ungodly hours, but also face constant job insecurity and meager pay in relation to their skill set. In addition, web devs can hold their own with game devs when it comes to programming paradigms. Which paradigms are the most important varies in each area, but at the moment the trends in web and games are diverging. Best programming practices also vary between the two industries, but that’s natural, given that they tackle very different problems. Ultimately, web and game developers are almost in two different worlds.

 

Game Developers and Work/Life Balance

If you’re a neurosurgeon, why would you go back to walking dogs? Well, even neurosurgeons get burnt out sometimes. Game developers, as it turns out, get burnt out a lot. And if they don’t get burnt out, then job insecurity stabs them in the back.

Most game development studios depend on external publishers. These publishers deal with many different products, and video games are just one item on their lists. As a result, they are often fickle when dealing with contracts, canceling months-long studio projects with no notice.

The result of this uncertainty is twofold. On the one hand, many game developers feel the sword of Damocles hanging over their heads, going to work every day with the possibility that it will be their last. Worse, they wouldn’t be fired for any wrongdoing, but because their publisher changed its mind. So, game developers live with a tremendous amount of uncertainty in their day-to-day lives.

On the other hand, despite this uncertainty, game developers and their managers want to convince the publishers not to cancel the contract. To do so, they need to show that they’re excellent collaborators. So, deliverables must get delivered every time, no matter what. If that means that members of the development team are still working their rear ends off at 11 p.m., so be it. If everyone needs to work Saturdays to get the work done, so be it. If developers don’t have time for a life outside work because they’re barely keeping up on eating and sleeping as it is, so be it.

Although not every gaming studio is made the same, pressure, crushing overtime and job insecurity are often the norm. Therefore, it’s not surprising that some game developers leave the scene to be better able to care for their families and themselves.

 

When Gamers Go Back to the Web

If game development requires so much skill, you might ask, why does it lead to so much burnout and job insecurity? The reason is relatively simple.

Becoming a game developer is many people’s dream. Making something that tells a story, entertains and inspires people is an extremely tempting idea.

We have seen the downsides of the field, however: extreme pressure, unpaid overtime, job insecurity. Even worse, in contrast to web development, it’s quite difficult to get a job as a game developer because the talent pool is large in relation to the number of available jobs.

Counterintuitively, all that stress isn’t compensated by a big salary. Most salaries for game developers start around $60,000 annually for entry-level developers and top out around $125,000 for senior developers. In comparison, web developers tend to earn somewhere in the range of $54,000 to $103,000. Factor in the job security of the web, and it’s quite understandable that many people will choose the less stressful path.

Therefore, the gaming industry is not a place in which many developers who have families to feed or who are at risk of burnout will thrive. Those who do have family demands or can’t work a zillion hours each day often drop out of the industry and, despite their massive skill set, become web developers instead. The software is simpler and the jobs are more stable in web development, so even if the transition can feel like moving into a whole new world, it can be a decision with many benefits.

 

Programming Paradigms Are Shifting

Why do we need to think about programming paradigms? After all, there are many no- or low-code tools for both web and game development. At some point, however, developers can’t get around code. For web developers, this might happen when they need to manage databases connected to a webpage or build a site’s core logic. A game developer might need to code to clarify the game’s logic or to build AI components. Especially as AI gains traction in gaming circles, coding will be impossible for game developers to avoid.

Programming paradigms, meaning theories on how to write code, abound in every area where software rules the game. Since these paradigms place different levels of importance on aspects such as speed, readability and maintainability, the dominant paradigm varies by area.

For video games, age-old object-oriented programming remains the norm. This makes perfect sense based on how video games operate. For example, a player of a certain game might be an instance of the “Player” class. Any zombie, vampire or shark might be an instance of the “Monster” class. This type of programming is intuitive, and, at least for most games, flexible and expandable enough to add upgrades and new features later on.
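To make the idea concrete, here is a minimal, hypothetical Python sketch of that object-oriented style. All class and method names are illustrative, not taken from any real engine:

```python
# A hedged sketch of object-oriented game code: a shared base class,
# with Player and Monster as the kinds of instances described above.

class Entity:
    """Base class for anything with a position and hit points."""

    def __init__(self, x, y, hp):
        self.x, self.y, self.hp = x, y, hp

    def take_damage(self, amount):
        # Hit points never drop below zero.
        self.hp = max(0, self.hp - amount)

    @property
    def alive(self):
        return self.hp > 0


class Player(Entity):
    def __init__(self, x, y):
        super().__init__(x, y, hp=100)
        self.score = 0


class Monster(Entity):
    """Any zombie, vampire or shark could be an instance of this class."""

    def __init__(self, x, y, damage):
        super().__init__(x, y, hp=30)
        self.damage = damage

    def attack(self, target):
        target.take_damage(self.damage)


player = Player(0, 0)
zombie = Monster(5, 0, damage=10)
zombie.attack(player)
print(player.hp)  # 90
```

The appeal for games is exactly the flexibility the paragraph describes: a new enemy type later on is just another subclass, and the rest of the game code keeps working unchanged.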

Object-oriented programming is still relevant to web development too. The broader field is trending toward functional programming, however. This paradigm is particularly useful for many pervasive applications like big databases, parallel programming and machine learning.

Particularly when web devs deal with extensive database queries or other back-end development issues, using elements of functional programming is increasingly in vogue. By contrast, game devs can only really make use of it when implementing AI components. When the rest of the game is written in an object-oriented style, that might not be worth the effort in many cases.
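As a rough illustration of what that functional style looks like in back-end work, here is a hypothetical Python sketch: small pure functions composed over immutable data, with no shared state mutated along the way. The data and function names are invented for the example:

```python
# A hedged sketch of a functional style for processing query results:
# each step is a pure function; the input tuples are never modified.
from functools import reduce

orders = (
    {"id": 1, "total": 40, "paid": True},
    {"id": 2, "total": 15, "paid": False},
    {"id": 3, "total": 99, "paid": True},
)

def paid_only(rows):
    # Filter without mutating the input.
    return tuple(r for r in rows if r["paid"])

def totals(rows):
    # Project each record down to the one field we need.
    return tuple(r["total"] for r in rows)

def sum_all(values):
    # Fold the values into a single result.
    return reduce(lambda a, b: a + b, values, 0)

revenue = sum_all(totals(paid_only(orders)))
print(revenue)  # 139
```

Because each function is pure, the steps can be tested in isolation and safely run in parallel over large datasets, which is exactly why this style suits databases and machine learning pipelines.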

This all contributes to the growing divergence of web development from games. Although the most popular paradigm is still object-oriented programming in both areas, the trend toward functional programming in web development means that the two worlds are slowly drifting apart. Sooner or later, there will be little common ground between the two areas, which contributes to the rift that’s forming between them.

 

The Tough Transition From Web to Games

Some programmers become web developers as a first step toward the end goal of game development. Given that this trajectory goes from lower- to higher-skilled work, you might be tempted to think that this is the classical path for getting into games. The reality, however, is far from that!

Although it’s true that transitioning from web to games is possible, a few obstacles make that career path tough. For example, a programmer might want to build a game where players can jump around and fall down when there is no surface to catch them, like in real life. They might proceed to code all of this from scratch. This process is extremely tedious and time-consuming, and it’s therefore quite likely that the programmer gives up before delivering anything.

Since these things have been done many times before, a better practice is using the built-in functions in Unity, Unreal or whichever game engine is in use. This saves time and increases the probability that the final result gets delivered. The problem is that many developers unfamiliar with games don’t know about these shortcuts. This is just one of many obstacles to becoming a game developer.

Another differentiating point for games is how projects are set up. In “normal” development — which includes web development — most things are code-based. To execute a project, you’ll have to call a main() function or an equivalent of that. Not so in game development, however. There, everything is based on scenes in which you define how the application starts, which dependencies it has and so on.

Web developers tend to be unfamiliar with scene managers and scene graphs when they try to start in games. A scene manager roughly corresponds to a software design pattern web devs might recognize, but a scene graph, which specifies the relationships between different objects, has no real equivalent. Therefore, switching between pure code and scenes is more than changing some vocabulary around. The two concepts are so different that they require completely different ways of thinking.
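To give a flavor of what a scene graph is, here is a minimal, hypothetical Python sketch. Real engines do far more (transforms, rendering, events), but the core idea is a tree of nodes where each child’s position is relative to its parent:

```python
# A hedged sketch of a scene graph: moving a parent node implicitly
# moves every child, because positions are stored relative to the parent.

class Node:
    def __init__(self, name, x=0, y=0, parent=None):
        self.name, self.x, self.y = name, x, y
        self.parent = parent
        self.children = []
        if parent is not None:
            parent.children.append(self)

    def world_position(self):
        # Walk up the tree, accumulating parent offsets.
        if self.parent is None:
            return (self.x, self.y)
        px, py = self.parent.world_position()
        return (px + self.x, py + self.y)


root = Node("root")
ship = Node("ship", x=100, y=50, parent=root)
turret = Node("turret", x=10, y=0, parent=ship)

print(turret.world_position())  # (110, 50)
ship.x += 5  # Move only the ship...
print(turret.world_position())  # (115, 50) ...and the turret follows.
```

This is the mental shift the paragraph describes: in a main()-driven program you compute everything explicitly, whereas in a scene graph the structure of the tree does part of the work for you.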

In summary, a web developer switching to games doesn’t only need to overcome a few obstacles. They also need to acquire a totally new way of thinking about software. This requires lots of hard work and passion and more than just a few weekends of study and practice.

 

Best Practices for Whom?

Having a set of best practices is always useful. They become all the more important when you’re dealing with complex processes or collaborating with others. Obviously, these exact best practices vary both by industry and individual companies within an industry.

In web development, best practices often encompass principles such as keep it simple, don’t repeat yourself, and code responsibly. These guidelines apply to all programmers to some extent, of course. They’re particularly important in web development, however, because nobody wants websites that are overly complicated, that have multiple components with the same structure but are defined in different places or that are poorly documented and tested.

All of these ideas are important to games too, but gaming principles take them a step further. Gaming best practices include simplify, fix bugs immediately, and code transparently. After all, nobody wants overly complicated code that slows the game down, or a buggy game or a messy codebase in which no one knows what’s happening where.

Therefore, the principles of web development are equally relevant to game development. But games take them a step further and refine them because the complexity of the matter requires it. Game developers, therefore, tend to be so versed in best practices that they leave web developers behind. This constitutes another element of the divide between games and web.

 

Gaming: The Fort Knox of Development

In terms of size, the web and game industries are quite similar. In terms of complexity and required skills, the games industry wins. This comes with a downside, though, as game programmers often need many hours to debug a game because it’s so complicated.

Despite the burden of job insecurity, unpaid extra hours and less-than-adequate pay, many developers decide to pursue their childhood dream. Since competition is fierce, this means risking many losses. In this sense, game developers seem like idealists who would do anything to achieve their goals.

Web developers, on the other hand, are way more pragmatic. They face an easier and safer route to a decent salary by choosing a job that is not too taxing and where competition is substantially less intense.

Although some people manage it, gaming seems to be the Fort Knox of software development. To get in, you’ll need an extensive skill set and an understanding of your clients’ needs. It can be done if you really want it, but you can rest assured that there will be some difficulties along the way.
