This Dumb Industry: In Defense of Crunch
When people who work in the games industry say they crunch because they're young men with poor boundaries who would do anything to make video games, and when industry veterans literally write PowerPoint decks about hiring suggesting targeting young men with poor boundaries because you can get them to crunch since they will do anything to make video games, I choose to believe their words. This lines up with external evidence.
Scheduling software is hard: granted! But do we see 90-hour crunch on every single shipping product in the US software industry? No, that's ludicrous. Do we see 90-hour crunch on substantially every shipping software project in, I don't know, the Japanese software industry? Oh we do! Curious! Does that industry also write schedules which assume crunch? I mean it sounds far-fetched but no, literally in the design document written in the first week of a three-year project there are exhortations about how understaffed we are (Why?), how tight the schedule is (Why?), and how heroic efforts are required (Why?). And does the Japanese software industry hire people with poor boundary control and ruthlessly inculcate lower boundaries? Great Scott it does!
Crunch in the video game industry is not an accident. It is a policy. Do not work in video games.
I think you might be missing the point of the article. The metric for measuring a game's success is how fun it is to play. You can't plan fun; that only comes from having a playground and tweaking things until the gameplay feels satisfying. Note I use words like fun and satisfying, words that are highly subjective and can only really be ticked off when you have the game in front of you and can agree it was fun to play.
The metric for measuring the success of software on the other hand is directly proportional to the amount of functionality it implements, as defined in the requirement spec. These totally different goals for development are the reason the industries are so different, and why one is harder to plan for than the other.
You don't have to build the whole game to get started on testing how fun it is to play; you start with an ideas sandbox and go from there. A large part of modern game development is building up art assets, but you only need a minimal set of art assets before you can start playtesting and refining ideas.
Not always true. For example, the combat animation in the Arkham games: without it, how do you test that the combat is satisfying? Or the supercars in GTA: how do you test that they break in interesting ways without having the finished mesh? Or the gun projectiles in Fallout: how do you tweak them to feel right without a good bullet mesh, physics, collision, particle effects and so on? Satisfying gameplay is just as visual as it is functional.
I said minimal, not zero. To use your Arkham combat example, you do need to build the character animations, but you don't need to build the Arkham city environment in which those combat animations will eventually take place.
To give a visual example, in Street Fighter 5 there's the training stage, which is very simple but ideal for testing character animations and game mechanics. If you were developing Street Fighter 5, wouldn't you build such an environment first?
You are saying then that the animations are 'minimal' in the Arkham games, which they are not. To make the animation as fluid as it is would require a lot of work and tweaking. Street Fighter, on the other hand, has a defined set of animations that are played independently of the attack animation, resulting in far less work but overall less satisfying visuals.
Finally, of course you can build the environment separately if there is no dependency on it.
> "You are saying then that the animations are 'minimal' in the Arkham games, which they are not."
I'm saying they are the minimum you need to do the playtesting. If playtesting is a priority then you work on the elements important for playtesting first. "Minimal" is not a reflection of the amount of work that goes into getting up to the playtesting baseline.
The metric for success on any commercial software is long-term ROI, not "fun" or "functionality". This applies to both games and business software. Fun and functionality can't really be quantified in a way that's consistent or comparable across products.
Some of the most successful games aren't really much fun for the most active players, such as MMORPGs and free-to-play mobile games. They seem to succeed more through triggering addiction and a sense of competition amongst players rather than providing an enjoyable experience.
A major point of the agile manifesto was that building any software to a waterfall spec results in unhappy users. Even enterprise software is iteratively designed these days (well at good places at least). Games are not special.
Agile is no magic bullet. Additionally, 'change' means a very different thing when you are talking about inflexible data (3D models, textures, animation, etc.), where it usually means throwing the data away completely and starting over. This is a lot more work than adding a form field to a web page, or changing the business logic behind a query by moving some code around.
This^^^. Scheduling for some types of software is harder than others. I mean it's always a pain in the neck, but not necessarily intractable.
I used to work on shrinkwrapped tools for SQL Server and .NET developers, and over time got pretty good at scheduling releases with an ever narrowing window (that was never that wide to begin with), even accounting for the inevitable changes you'd need to make as you got customers to try out the software throughout the project. And you needed some degree of certainty over scheduling in order to coordinate with marketing, make sure sales and product support were trained, dovetail with other projects so that people could be reassigned, etc. Key point, as well: teams are cross-functional, but relatively small.
Line-of-business apps are trickier because you're often dealing with stakeholders who either aren't sure, or can't well express, what they need. You also have the kind of integration problems that occur much less often with shrinkwrapped software. Cynically, I feel like a lot of unwillingness to schedule for these kinds of projects, especially at moderate team size/complexity, is really because people just don't like doing it, for a variety of reasons. And by schedule, I mean sit down and do the homework in detail, not just throw a convenient date up on the board (I've never seen anyone do a good job of estimating a software project in the large this way, and it doesn't help the business make informed decisions when you end up shifting the date or cutting functionality under duress later on).
You get similar problems with public facing web apps: because you can roll out a new version of the software whenever you want this allows you to chop and change functionality as needed (or wanted), so things can grow tentacles. Team size and complexity can vary a lot here but, obviously, bigger, more complex teams make it harder to schedule. If you're rolling out functionality incrementally though, this may not present a big issue commercially.
Games are a whole other thing, because the feel of playing, and the fun factor are absolutely key. This is also why I don't think film development is a great analogue for game development. Film is not interactive, but games are, which adds a layer of complexity film doesn't need to worry about. And this comes whilst having a production team of a comparable size and complexity to that of a film for big budget games.
To be honest though, I'm not sure how much team size and complexity factors in with games - just watch the Indiegame film where you have very small teams in all cases, yet (particularly memorable with Fez) development takes massively longer than expected, and involves a rewrite to get the game right. If that hadn't happened it wouldn't have been as good a game.
Like I say, I think games really are a different kind of problem - perhaps the closest thing software project management has to an NP-hard problem. It's really no surprise they ship late, and that crunch is a reality. Perma-crunch though: well, that just makes me feel sick to think about.
The point of agile is for a team to achieve the maximum possible sustainable velocity. If you're building a custom line-of-business application, a web SaaS, or shrink-wrapped productivity software then you typically want to manage it as an ongoing program with multiple releases spread over many years. So a rational manager won't burn the team out on the current release because she needs to keep the team intact to work on the next release. Whereas with most games there's very little maintenance and enhancement work after the initial release, so it might be economically rational to overwork the team in the short term even if that pace is unsustainable.
If game development paid overtime, game scheduling would be as well developed as film scheduling is. When a film goes over budget, the director and producer usually get their share cut.
I've written about this before. [1]
Paying overtime would solve a lot of project management issues in tech companies. Overtime is free, so why try to avoid it? If people get burnt out after a while, there are plenty of people who are happy to replace them.
I think this is the crux of the problem: in the US and Canada (at least) there is an IT class of professionals that is exempt from overtime. Why? Please, someone tell me why?
And it's not just the gaming industry.
I am involved with 10 projects right now (way more than my usual load). 1 of them has a reasonable schedule (though my component is somewhat aggressively scheduled), 3 have completely ridiculous schedules (requiring overtime and weekend work from day one), 1 has stupid political nonsense going on which makes it a complete cluster and the timeline silly, 2 others are being rushed in phases since the timeline was stupid to begin with, 2 others are crunched because they are related to the 3 that have stupid timelines (forcing the team to split into 5 different teams and work in parallel), and the last one is being finalized but had some pressure on my component, which rushed my work.
So, anecdotally, 90% of the projects I am involved in currently have issues with their scheduling.
As I have been involved with software/hardware development since 1995, my experience suggests this is nothing new.
It's time for repeal of some Fair Labor Standards Act exemptions. There's one for computer programmers.[1] The threshold for exemption from overtime needs to be pushed up to the point that only the 1% are exempt.
I agree 100%. When on salary you are buying a piece of my time in exchange for an agreed upon amount of compensation. For an employer to make the former variable but not the latter is B.S.
Also, the whole more-than-40-hours-per-week expectation is nonsense. I am independent now and can do the same (or more) amount of work in 30-40 hours a week. In fact, I worked ~500 fewer hours my first full year of being independent. Did I make a little less money? Sure, because I don't get paid time off anymore. And I also worked a lot less.
I like that idea, also please take a cut from the salesman.
Neither film nor games have a "salesman" per se -- are you saying this just because you dislike salespeople?
If he says it for the same reason I'd say it, it's because in my experience salespeople tend to provide "aggressive" estimates (e.g. "You said 10 days; that's almost the same as 5 days, isn't it?") if they think they have a better chance of making a sale that way. So, maybe, cutting into their share would help deliver a (IMHO) very important message: there are deals which aren't worth it.
And yet if it was the other way round, maybe our films would be better and actually make sense.
There is nothing that I agree with here. Crunch = Producer fail. Feature Creep = Producer fail.
The suggestion that "scheduling is hard": no sh*t, Sherlock. That's why you don't hire a 20-year-old producer who did one title at his last company.
"It’s your entire job to say no to stuff like this [asking for more time]."
No, as a manager it's your entire job to make sure stuff like this doesn't happen. How did you let the project lead get into this situation? How irresponsible was it to basically gamble - not just with money, but with other people's time and careers - that the project would be on schedule? "Oh, but that's the way games get made." Right, so now you're just normalizing deviance.
>Scheduling isn’t just a problem in videogames, this is all kinds of software development.
And yet, somehow, videogames still manage to be worse about scheduling and effort estimation than pretty much any other kind of software system. Why is that? Why do managers of video game development teams feel like their industry is a special snowflake in software, that it's not bound by the same constraints as enterprise CRUD apps?
EDIT: Literally every point he brings up regarding feature creep and estimation applies equally to enterprise CRUD apps. So as bad as estimation for enterprise applications is, I never hear about people working 80-hour weeks for years on end to bring Widget Planner 3.55 (Now With Fruble Support!) to fruition.
I see a lot of cases of people saying estimation is impossible, which upon closer inspection turn out to be a different argument: "perfect foresight is impossible".
The Nirvana Fallacy, in other words. Because it's not perfect, it's worthless.
Of course perfect estimates are impossible. You need to make them anyhow. And if you don't make them explicitly, you'll make them implicitly, and they'll be worse.
That's not the problem. The problem is that each step in the process is variable, and not by a small amount but potentially by an order of magnitude. That's the problem. The second problem is that the industry does not pay for reasonable estimates; it wants imperfect, unreasonable estimates with which to beat the employee or the contractor over the head to force overtime and crunch.
> The problem is that each step in the process is variable, and not by a small amount but potentially by an order of magnitude. That's the problem.
That's not unique to game development. Or software. Or any industry, really.
> The second problem is that the industry does not pay for reasonable estimates; it wants imperfect, unreasonable estimates with which to beat the employee or the contractor over the head to force overtime and crunch.
Of course, but that's because the estimate is not being treated as an estimate. It's being treated as a plan, or a goal.
I read a book called Industrial Megaprojects in which the author, talking about civil and industrial projects on a gigabuck scale, lamented the same problems.
> That's not unique to game development. Or software. Or any industry, really.
Yes, it is. It's not literally unique to software, because R&D and "nobody ever made something like this" projects share it, but most industries don't have it.
In games we HAVE to push things forward, we HAVE to innovate, or be seen as a copy of an existing product. That being said, that's what "Pre-Production" is all about: answering the questions that are new to this project.
In the piece it says something about "needing more windows", that isn't what I'm talking about. That's just poor game design at the outset.
I was refused an interview once because they were looking for the exact person who had done a hit title: he "obviously knows what he's doing". The product in question was 2 years late and 1.4 million over budget. I said: "I can do that...". I think the game industry is unique in that we're often hired, or not hired, based on how famous the title we worked on last was.
The US Navy developed the Program Evaluation and Review Technique (PERT) specifically to solve the first problem. For a particular task you can say: optimistic time = 1 week, most likely time = 2 weeks, pessimistic time = 20 weeks. If you decompose a large project into a large number of smaller tasks, then the critical-path schedule estimate should be close to reality. Or, if you want to be more sophisticated, you can do a Monte Carlo analysis to create a probability distribution of possible completion dates, and then make commitments based on whatever statistical level of confidence makes you comfortable.
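A minimal sketch of the Monte Carlo approach described above, using Python's standard library. The task names and three-point estimates are made up for illustration, and a triangular distribution stands in for whatever distribution you believe your tasks follow:

```python
import random

# Hypothetical tasks on the critical path:
# (optimistic, most likely, pessimistic) duration in weeks.
tasks = {
    "engine integration": (1, 2, 20),
    "level design":       (4, 6, 12),
    "playtest fixes":     (2, 3, 10),
}

def simulate_project(n_runs=100_000):
    """Sample each task's duration and sum along the critical path.

    Returns the sorted list of simulated total durations, from which
    any percentile can be read off directly.
    """
    totals = []
    for _ in range(n_runs):
        total = sum(
            random.triangular(lo, hi, mode)  # signature: (low, high, mode)
            for lo, mode, hi in tasks.values()
        )
        totals.append(total)
    totals.sort()
    return totals

totals = simulate_project()
p50 = totals[len(totals) // 2]       # median outcome
p90 = totals[int(len(totals) * 0.9)]  # 90%-confidence commitment date
print(f"50% confidence: {p50:.1f} weeks, 90% confidence: {p90:.1f} weeks")
```

The point of committing at the 90th percentile rather than the median is that the gap between the two is explicit schedule buffer; committing at the median (or worse, at the sum of the optimistic estimates) is how crunch gets baked into the plan from day one.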
> Keep in mind that when you’re giving a quote like this, you are basically making a promise to get the job done for a certain price.
Favorite quotation from this article. So much insanity and stress flow from this.
But hey, this is a team. We're all in it together. If things go wrong, count on my support ... unless it means you'll miss your deadline. What kind of loser-idiot misses their deadline? I asked you how long it would take at the start! Why didn't you tell me six months ago it would take three additional weeks? Let's try this again: how long will it take to finish level 9? I think you know the right answer now, don't you?
All those words and none of them touch upon the obvious financial bullshittery of being expected to work as much as double time for the same salary.
When I started my career I worked for a University and then a private scientific engineering/manufacturing company. The latter was purchased by a public company. They cut 20% of the staff and expected the same output. This has been going on for 20 years.
"scheduling is hard"?? Yeah, if you have no fucking clue what you are doing I guess everything is hard!
Shipping successful products on time and in-budget is not rocket science! The right people are out there...you just have to hire for it.
It is not hard at all, but typically external pressures force people to agree to things that don't make sense. I have a client that has an event every year at the same time (let's say it's mid-July). They want projects to be completed before this event every year. But for some reason, they don't start planning these projects until, let's say, the end of April. So every year it's the same crazy pressure and the same quality issues.
If only the planning for these projects started in say, October? But no, in October the team is still dealing with the fallout of the poorly planned projects of the current year.
You can take risks, but you must have a share of the rewards. Taking risks in exchange for nothing is not fair.
Seems like the solution to this is not to promise release dates, and fixed budgets.
Release dates aren't going anywhere as long as there's a traditional retail chain and marketing spends.
You can do a hell of a lot by just paying workers overtime: people in the film industry in LA may work overtime, but if they go over budget it comes out of the showrunner's cut. In software, overtime is just free time, and they'll squeeze you out like an orange and replace you with a new fresh-eyed person when you can't go anymore.
> In software, overtime is just free time, and they'll squeeze you out like an orange and replace you with a new fresh-eyed person when you can't go anymore.
Thankfully, in many European countries software development is under the same rules as any other job, and one is entitled to either free time or money for those extra hours.
Granted, not all companies play ball, but then one can make them play ball, with some external help from either unions (we have IT unions) or lawyers.
The "management" exemption is often heavily abused to extend unpaid overtime. It happens at e.g. Rockstar and Jagex.
When I was younger, I decided that failures in schedules were failures in management. Now that I am older, I feel that failures in schedules are failures in management. Not failures in technology.
As patio11 mentions elsewhere in this thread, schedule crunch, except for the very first time, is a deliberate decision.
I'd never work for this guy
Nope, nope, fuck you.
Voluntary crunch leads to a culture of overtime. You won't be considered a "team player" if you decide that you don't want to spend weekends and extra hours in an industry where you make 40% less than your peers.
I feel even more strongly about this after having a kid. There's no way I'm destroying the relationship I have with my family for your product. I saw it again and again when I was in the industry (at least 3 divorces on the last project I worked on) and I'll never go back.
Also, there's solid data against crunch. I've posted the Game Outcomes Project before. They sampled hundreds of developers and found that crunch correlated very negatively with project outcomes.
He's right that voluntary crunch is less bad than mandatory crunch, but it still has a negative impact.
http://gamasutra.com/blogs/PaulTozour/20141216/232023/The_Ga...
This level of hate is unfounded - the author specifically condemns the kind of situation you're describing. He's arguing that teams must maintain a culture of normal working hours, to keep energy and morale high even into the project's final weeks, just in case a brief crunch becomes unavoidable.
Unless you're saying that you saw three divorces caused by a project where nobody worked any overtime until the last few weeks, you're attacking a straw man here.
Not quite, from the article:
> Perma-crunch is stupid and destructive, but voluntary crunch by people who are trying to push the medium is a good thing. Hard work is not bad or evil. Great things can happen when you’re willing to push.
The thing with voluntary crunch is it becomes institutionalized and part of your company culture. If you don't pitch in you're managed down and out since there's always some poor soul willing to do your job.
At least until they burn out in 3 years (look up the Gamasutra game dev salary survey; average tenure is just under 3 years).
That said some of the best people we hire are ex-gamedev.
In the industry you're trying to find the intersection of sane hours, good pay and stability. If you're in the top 5% of studios you might find that but a large majority of people never even see one of those 3 things.
>The thing with voluntary crunch is it becomes institutionalized and part of your company culture.
Just say no. What I like about voluntary crunch is that it is voluntary. At the current job we don't have them; it is a BigCo after all, work-life balance, you know. At one of my previous jobs we had semi-voluntary crunches. Well, I would just get up at the standard time, pack my laptop, and say "Bye!" to all the guys who semi-voluntarily stayed for the semi-voluntary crunch. Was rated as the top performer anyway :)
I come from the USSR/Russia, a country which has relied heavily on the heroism of its people through various moments in its history. One thing becomes obvious when you learn all that history: heroism is usually required as a result of preceding idiocy performed by some other people.
> The thing with voluntary crunch is it becomes institutionalized and part of your company culture.
That's exactly what the author argues teams must avoid. His premise is explicitly that a little crunch isn't harmful as long as it's not an institutionalized part of the company culture.
A little crunch is harmful too, just less harmful.
Crunch is rarely a necessity. If the deadline can't be moved, and you're going to miss it, work on cutting the workload (drop features, etc...). If the project can't have any corners cut, then move the deadline. If neither of these is an option, then do what you have to do and find new managers for the next project, because they should've been more on the ball and planned for potential slippage by giving a time buffer to avoid crunch.
I don't think the author would disagree. He's saying that if you avoid crunch 99% of the time, then crunching for the last 1% isn't very bad and might even be good. He's not saying that you can avoid 99% but that the last 1% is inevitable.
> "the last 1% is inevitable"
I don't think it's inevitable at all. If you plan for slippage by not being too optimistic with time schedules then it can be avoided.
> I don't think it's inevitable at all.
I said the author is not saying it's inevitable.
I agree, and as a consumer of games I find crunch stupid and unnecessary. If a game needs another week just DELAY IT A WEEK. This idea that once marketing sets a date of completion that it's a total must-make is complete nonsense.
Except it isn't. An entire company could be jeopardized by missing, e.g., a holiday deadline.
That's quite possibly true now, though I'd argue that is largely because game publishers killed the goose that laid the golden egg by shipping tons of totally broken games and ruining the idea of pre-ordering. Likewise I think that process could be reversed by showing fans that the company brass actually gives a damn about their product enough to delay it when it needs more time.
Nintendo is a fantastic example of this, I would say this happens to a good chunk of their titles, Zelda especially. Yes it's always a bummer, but Nintendo also hasn't had an Arkham City or a Division where the game comes out half finished, full of bugs, with the promise of "we'll patch it later."
On the other hand, Star Citizen.
Its wild profits undermine my point slightly, granted, but games like Duke Nukem Forever suffered hugely from "When It's Ready", poor expectation management, and final products ultimately not living up to the hype.
Duke Nukem wasn't a case of crunch though, it was a case of severe mismanagement and lack of any real direction. Crunch would not have saved it. The vast majority of titles given a delay of two weeks or even a month or two in extreme cases would not undermine sales.
The problem with crunch is similar to the problem with open-plan offices: there's very much a post hoc ergo propter hoc fallacy that worms its way into people's minds. A studio will work on a title on a tight schedule, go into crunch, get it out close to schedule, and have a successful launch. Then they'll start to believe that crunch is good. Similarly, people in open-plan offices will have serendipitous discussions that result in some positive collaboration outcome of some sort, and they'll start to believe that open-plan offices are good too.

But this is just falling into the trap of thinking that the way it happened is the only way it could have happened. It also ignores all of the costs, including opportunity costs, of doing it that way. We don't have the ability to pop over to parallel timelines and ask our multiverse selves what the over/under was on trying things a different way. More so, because we tend to be averse to loss (another common rationality error humans make), we tend to view change that might result in "losing" something we see as valuable as scary and risky, even if it would be beneficial overall.
Consider the costs of crunch. The quality of work diminishes. Morale diminishes. Stress rises, personal lives are negatively affected. People become bitter about their employers. People become depressed. And so on. And this is true even when crunch only happens sometimes, or even rarely. As a consequence of all those things sometimes you lose good people. Some of them seek greener pastures, and often these are the most talented and experienced people. Why? Because they have the easiest time finding work elsewhere and often they know how valuable they are and don't like being abused and misused. Especially if crunch involved doing work that wasn't a good use of their abilities and talents, which is very often the case. Also, sometimes people burn out, and either leave or stay while still being burned out, and you lose a lot of talent that way too.
All of this talent loss and destruction has a real, tangible impact on the company. It becomes harder to execute on things, especially challenging projects. But the thing is, none of this is objectively obvious right away, it takes time, often years, for it to become apparent. And because of the tendency for every project to be unique in its own way, it can be hard to pin the blame for diminished success on these things, unless you were already convinced of the idea already.
This is a huge problem, because it means that companies which crunch too much, which in game dev is almost all of them, are constantly losing high-tier talent as well as losing team cohesion (and talented, gelled teams are how you get shit done in software and in games). That means everything is operating all year round not only at reduced capacity and capability (lower development velocity, lower quality, etc.) but also at a less advantageous ratio of output to cost (talented devs and artists are worth vastly more than they are paid, as are gelled teams). And that's aside from all of the talented, experienced people who won't work for you because they refuse to tolerate the working conditions.
All of that adds up to much, much larger costs than the meager seeming advantage that occasional crunch gets you (and permacrunch gains you nothing).
The reality is that the reason why crunch is still tolerated is because there is too much eagerness to be in game dev. So many people view it as their dream job, and that means they tend to tolerate a lot of bad behavior from their employers, especially when they're younger. By the time they grow tired of it and move on or burn out there's going to be a dozen, a hundred, or a thousand other engineers just chomping at the bit to take that spot and fulfill their dreams too. A lot of managers in game dev don't see anything but headcount and dollar signs, and it's so much easier to con some eager beavers into crunching their way to ship something that's good enough than to take the time and effort to get it right.
> So many people view it as their dream job, and that means they tend to tolerate a lot of bad behavior from their employers, especially when they're younger.
Same thing applies to acting and production jobs in film/TV/theater.
We all wanna be where the magic happens.