"The Last of Us Part II" bodes ill for the future of the games industry
Has the modern video game become unsustainable?

“Because survival is insufficient,” reads a line from Emily St. John Mandel’s “Station Eleven,” itself lifted from an episode of “Star Trek: Voyager.” In the novel, the quote is scrawled across the lead caravan of the Travelling Symphony, a ragtag band of actors and musicians striving to entertain whoever they come across with song and Shakespeare in the decades after humanity’s collapse.
“Station Eleven” is a pandemic novel, a fact that both surprised and scared the shit out of me after I jumped into it blind, having been assigned to read it for an English class. The book has seen a spike in sales ever since the novel coronavirus was declared a pandemic back in March, and millions of readers like myself have stepped into the post-apocalyptic world Mandel lays out on the page and found ourselves unable to put it down. Despite being about a strain of flu eerily similar to ours (albeit one with a much higher mortality rate that kills its host within 48 hours), which you’d think would be the last thing you’d want to read about when you already have to live in this insane mess of a world on a daily basis, I was hooked, and sped through the book with much more enthusiasm than is usual for my reading assignments.
What caught me most was the book’s description of its broken environments, from the lush nature that has overtaken highways, airport terminals and hotels, to houses left abandoned, their stories told by the things left behind and inevitably looted. Mandel nails what a world 20 years removed from electricity and industry would look, feel and sound like. There’s a scene where main character Kirsten and a scouting party trek into an abandoned school in search of instruments, and after absorbing its raw details, the “sunlight pour[ing] through a hole in the ceiling,” “the floor strewn with broken glass, unidentifiable garbage, the remains of binders and textbooks,” I couldn’t help but think: this is exactly “The Last of Us.”
The post-apocalypse of “The Last of Us” is a decidedly more intense one, with humanity’s downfall coming at the hands of zombie-like creatures infected by a mutant fungus rather than a virus. The game follows Joel, a smuggler tasked with escorting Ellie, a teenage girl who is immune to the infection, across the country to a rebel militia known as the Fireflies. While the world of “The Last of Us” differs a little from “Station Eleven” in that the power grid didn’t completely give out and the world’s supply of gas didn’t quite go stale, Joel and Ellie still find themselves exploring the same sort of abandoned schools, hotels, homes and hospitals as the characters of “Station Eleven,” albeit killing a lot more people on their way.
I first played the remastered version of “The Last of Us” five years or so ago, and the game still remains a gripping tale of survival and found family in 2020, captivating audiences much in the same way that “Station Eleven” did. My enjoyment of the first game had me quite excited for the sequel, which finally hit store shelves after multiple delays in June of this year. The pre-release trailers for the game looked brutal yet oh so alive, the character design and animation putting even the realism of developer Naughty Dog’s phenomenal 2016 output “Uncharted 4: A Thief’s End” to shame. Hype for me and others was through the roof. That was, until the leaks.
I never looked at the content of them, but in late April when major plot details were posted online, my Twitter timeline was aflame. I couldn’t tell if it was serious fans of the series being upset at the direction the game had taken, good old “gamers” decrying the game as social justice warrior bullshit or legitimate criticism that the steps the game had taken to be more inclusive felt shoehorned in. Regardless, a dark cloud brewed over “The Last of Us Part II.”
Even though it received outstanding reviews (and on the flipside got review-bombed on Metacritic by our favorite vitriolic gamers), I kind of forgot about it. It wasn’t until reading “Station Eleven” and feeling nostalgia for the first game that my impulsive brain felt inclined to pick up one of the many preowned copies from my local GameStop. I was ready to look past the controversy, see for myself what the game was all about, explore the lush environment of overgrown Seattle. I was going to write this article for the Daily about playing games about a pandemic in a pandemic. I would share it on Facebook and Twitter and a couple people would read it. Three-dollar paycheck.
I couldn’t get through more than five hours of “The Last of Us Part II.” I’m not sure whether I can judge if the game is good or bad, because the story was going in an interesting direction and the characters were well-written, but I just literally could not force myself through the game. It is not fun. In what I realize is an attempt at a meta critique of senseless video game violence, “Part II” is so endlessly violent that I need more than two hands to count the number of times I had to look away from the screen in my short playtime. No, I do not want to agonizingly slit this militia member’s throat. No, I do not want to blow open the cranium of this infected with a weighty shotgun blast. No, I do not want to watch someone get battered to death with a golf club.
Around every corner it felt like violence was lurking. The game is so manipulatively tense that I couldn’t even enjoy a lucid segment of Ellie covering “Take On Me” on guitar for her love interest Dina without feeling like someone was going to come around the corner and kneecap the both of them. The game was going to scar me before I finished it. I went again to GameStop and got my money back within the seven-day return period.
Yet despite all the guts and gore, something more nefarious buzzes over “The Last of Us Part II” like a fly circling a rotten corpse. In a revealing report from Kotaku, Jason Schreier details Naughty Dog’s brutal culture of crunch and how it’s working employees to the bone. There’s a million things I feel I could talk about with respect to “Part II,” from the game’s world design to its inclusion of a trans character who isn’t just set dressing, but I also feel like I cannot in good faith interrogate a game that was built on the backs of overworked and underpaid developers. Video games have long been my favorite pastime, but as the years go on I see them less as fun diversions and more as a microcosm of a whole host of problems. As the games industry marches toward the ninth console generation in blind pursuit of ever more graphically lifelike games, I find myself questioning the sustainability of video games as a whole.
***
“i want shorter games with worse graphics made by people who are paid more to work less and i'm not kidding,” reads a tweet from user count jordula back in June, which garnered almost a hundred thousand likes. I first saw this declaration attached to an image of a developer stylized in MS-DOS visuals, the exact type of graphics to which the tweet aims to return. I saved the image into my camera roll immediately.
It’s no secret that video games have become big business. According to a recent report by games analytics firm Newzoo, over 2.7 billion people will spend $159 billion on video games in 2020, with this annual revenue figure on track to pass $200 billion by 2023. And these numbers come as no surprise: consoles and PCs cost hundreds, sometimes thousands, new releases set you back sixty bucks a pop and most games nowadays further increase their revenue stream with DLC, in-game purchases and microtransactions.
Gone are the days of the budget double A game. No longer can you waltz on down to GameStop and get a used copy of “ATV Offroad Fury 2” for a cool twenty dollars, play it for a few hours and get your money’s worth. Maybe you’ll revisit it every month or so, who knows. No, nowadays it feels like games go one of two ways: the epic triple A (verging on quadruple A, honestly) single-player blockbuster with a forty-hour heartbreaker of a story that cost hundreds of millions to make, or the to-the-point indie that was made by a team of five guys somewhere in England that you can play with your friends for countless hours.
It’s no secret which of the two gamers have been choosing lately; just look to the recent popularity of titles like “Among Us,” “Phasmophobia” and “Fall Guys” for your answer. The most played games on Steam continue to be legacy multiplayer titles, the likes of “CS:GO,” “Dota 2,” “Rocket League” and “Team Fortress 2.” Can the same be said for some of 2020’s biggest releases, like “DOOM Eternal,” “Ghost of Tsushima” and “Cyberpunk 2077”? They sell a few million copies; most people either beat them or stop playing within a week or two and wait for the next big thing to come out. And what do they play in the interim? “Super Smash Bros. Ultimate.” “League of Legends.” Old reliable.
Has the formula for the modern video game become unsustainable? Of Metacritic’s all-time Top 100, only about thirty games were released in the past decade, and just nine in the past five years. Yet theoretically video games should be the best they’ve ever been, with the industry having the widest pool of resources it’s ever had. We’re out of the uncanny valley, with next-gen graphics promising the most lifelike rendition of Master Chief’s ass ever witnessed. Today’s video games rest on the shoulders of years and years and years of game development and innovation, and can learn from the successes and failures of yore to understand plain and simple what people like to play. We’re far from the early PSOne days, before the right stick was universally used for camera controls.
Yet quite honestly I would much rather play “Ape Escape” than ever be sat down and made to play “Marvel’s Avengers.” And that’s not a slight against modern developers either. The people behind today’s games certainly know more than I do and work their asses off to make their games look and feel amazing, but that’s exactly the problem. They’re rushed to meet the studio’s deadline, working twelve-hour days and weekends to make the hair wisp perfectly around Ellie’s ears. They get the job done, or they get out. And sometimes they find themselves working on a level for weeks only to find out that part had been scrapped without their knowledge.
I’ve had a lifelong passion for video games ever since my grandparents got me a PS2 for Christmas at way too young an age, and I’ve always been interested in what goes on behind the scenes. I remember writing emails to Nintendo in middle school to the effect of “How is a video game made?” and being giddy with excitement when I got a customer support response back. If not for my brain’s complete intellectual refusal of everything math and science, I would love nothing more than to find myself responsible for programming some of my favorite games. I even bit the bullet this summer and tried my hand at a “Pokémon” ROM hack.
I didn’t even get close to finishing the first area. If I walked away with one thing from that experience, it’s that game programming is tough work. Infinitely rewarding, too; being able to walk around a house with furniture that I placed and a door that I coded to open and close properly reignited that pre-teen fervor. But I could see how it would take me months to get even something close to functional up and running. And then you consider that the modern video game, a product infinitely more complicated than 16-bit assets, is dreamed up, created and shipped out in a matter of years, and it makes you appreciate the process a hell of a lot more.
I don’t know where I stand on “The Last of Us Part II.” Despite the horror stories of Kotaku’s report, all the employees interviewed made it clear that they were more than proud of what they were working on, even if it took a couple years off them. I just hate that crunch has become the standard, and that the increase in graphical power and demand for realism is setting up the industry to implode on itself. I look at the lineup of upcoming PS5 games and eagerly await titles like “Deathloop” and “Final Fantasy XVI,” but I fear for the man-hours behind them.
I try to name my favorite titles from the past few years, and the list I come up with consists mainly of short and sweet indie titles like “A Short Hike,” “Touhou Luna Nights” and “Anodyne 2.” Sure, there’s big budget games like “Uncharted 4,” “Super Mario Odyssey” and “Persona 5 Royal,” but none of those games quite had the emotional resonance of the final cutscene of “A Short Hike” or the triumph of beating Reimu in “Luna Nights.”
And when I come to think about it, I remember I only ever played the original “The Last of Us” once, never to touch it again. The games I listed I could play for hours and hours, much like the days when I would stay up past my bedtime replaying “The Legend of Spyro: A New Beginning” or running races in “Gran Turismo 4.” I drove forty minutes to Royal Oak to pick up a CRT TV just so I could get the experience of playing the PS2 classics all over again. I don’t quite think I’m itching to see a horse get blown up by a landmine in “The Last of Us Part II” in 4K anytime soon.