by [name redacted]
Part nine of my ongoing culture column for Next Generation. After the popularity of my earlier article, I pitched a companion piece about companies that had lived past their remit, yet technically were still with us. On publication we lost the framing conceit and the article was split into five pieces, each spun as a simple potted history. In turn, some of those were picked up by BusinessWeek Online. Here’s the whole thing, in context.
A few weeks ago we published a list of five developers that made a difference, helped to shape the game industry, then, one way or another (usually at the hands of their parent companies), ceased to exist. One theme I touched on there – one that a few readers called me on – is that although in practical terms all the listed companies were indeed defunct, several continued on in name (Atari, Sierra, and Origin), living a sort of strange afterlife as a brand detached from its body.
This was a deliberate choice; although Infogrames has been going around lately with a nametag saying “HELLO my name is Atari” – and hey, why not; it’s a good name – that doesn’t make Infogrames the historical Atari any more than the creep in the purple spandex with the bowling ball is the historical Jesus. (Not that I’m relating Infogrames to a fictional sex offender – though he is a pretty cool character.) The question arises, though – what about those companies which live on in both name and body, yet which we don’t really recognize anymore? You know who I’m talking about: the cool rebels you used to know in high school, who you see ten years later working a desk job, or in charge of a bank. You try to joke with them, and they don’t get a word you’re saying. You leave, feeling a mix of fear and relief that (as far as you know) you managed to come out of society with your personality intact.
The same thing happens in the videogame world – hey, videogames are people; all our sins are handed down. This article is a document of five great companies – that started off so well, ready to change the world – that… somehow we’ve lost, even as they trundle on through the successful afterlife of our corporate culture. And somehow that just makes us miss them all the more.
Activision
In 1979, the year of Asteroids, four of Atari’s brightest stars got fed up with the mill. Amongst each other, David Crane, Bob Whitehead, Alan Miller, and Larry Kaplan were responsible for over half of Atari’s home cartridge sales – yet were allowed no credit, no royalties, no individual recognition for their work, work that was quickly becoming recognized as a unique form of expression, and that clearly was of at least some value to their employer. So, as tends to happen, they revolted. With the aid of a record executive named Jim Levy, who provided a solid background in the promotion and marketing of artists, they set up their own game studio, named Activision.
Atari went nuts, screaming of conspiracy and espionage; the Activision team played it by the book, leaving empty-handed and reverse engineering the Atari VCS themselves, to produce their own original games – thereby becoming the first third-party console developer in history. At the time, it seemed like a revolution. “The video game business went from absolutely zero designer credit to something approaching rock star promotion,” David Crane said in an interview with Good Deal Games. “We wanted to create an environment where if a game player enjoyed the ‘writing style’ of a particular game designer, he or she could look for the next game by that same author and not be disappointed.”
By 1982, Activision’s stock had soared, and the company responded in a very modern way: by buying up all of the smaller companies it could find. At about the same time, Activision started to branch out into home PC development. The theme of “developer as author” had gained a strong foothold in the American game industry, inspiring fledgling companies like Electronic Arts and Interplay, and kicking off a sort of fledgling “games as art” movement. And then came the crash. Larry Kaplan quietly resigned and returned to Atari, complaining of monitored phone calls, that nobody was interested in hardware development, and that he simply wasn’t having fun anymore.
Though Activision weathered 1984, all of the recent acquisitions and ventures left its coffers lower than they might have been. Stock plummeted, and two more of the original team – Miller and Whitehead – again jumped ship, to form Accolade. “We owned stock, but the VCs got the controlling interest,” Whitehead explained to Digital Press Online. “We were insiders, so selling stock was a no-no, but the market had turned and our stock was a tenth of what it was… and morale wasn’t so good.” In the late ’80s and early ’90s, Accolade would pick up much where Activision left off in its original mission (including its fight-the-system attitude toward console development).
Two years later, CEO Jim Levy (a huge self-avowed fan) offered to buy out the struggling text adventure developer Infocom, noting the similarity in culture and mission between the two companies. The two companies quickly came to an agreement, and Levy promised to leave Infocom more or less alone to do its own thing. Six months later, in response to further losses, Levy was kicked off the board and replaced with Bruce Davis, the one board member who had opposed the Infocom merger. Over the next three years Davis is credited with slowly strangling Infocom to death by demanding a stream of new, disposable software in place of building up and maintaining a steady back catalog, as had been Infocom’s practice. The accelerated schedule and lower budget left Infocom little room to keep up with Activision’s demands, let alone the graphical adventures being published by companies like Sierra. In 1989, after twelve quarters of loss, Infocom was shut down; about half its employees were offered jobs within Activision. Five agreed; the rest stayed in Massachusetts out of disgust.
Two other events occurred around the same time as the Infocom merger. One, Levy’s original premise of individual credit was becoming harder to enforce due to the greater workload required by any individual game. Two, the internal Activision mindpool grew all the smaller as programmer Garry Kitchen formed his own independent studio. Due to the recent implosion of the industry, Activision was happy to minimize costs by moving development off-site. Then, when Bruce Davis took over Levy’s role as CEO, Pitfall! developer David Crane left to work with Kitchen, leaving Activision with none of its founders intact. In a recent interview with Gamasutra, Crane explained that “Activision became the giant of the early eighties by recognizing that a game is a creative product and requires a creative environment. Bruce Davis’ biggest mistake was treating video games as commodities, rather than creative products. I only mention this because it explains why I could no longer associate with the company.”
For a few years Kitchen’s studio continued its long-distance relationship with Activision; eventually Activision chose to drop its end of the tether, leaving Kitchen and Crane to dub their company Absolute Entertainment, and try their hand at self-publishing – resulting in such objects of mass bewilderment as A Boy and his Blob.
The same year Davis’ Activision dropped its relationship with Absolute, and around the time it shut down Infocom for good, Davis chose to transition Activision away from videogames as its major focus, putting emphasis instead on more general business applications. The overall company name changed to Mediagenic, while Davis retained the Activision brand for its console game ventures. One curiosity to come from this period is Mediagenic’s publication of Cyan’s seminal CD-ROM adventure game The Manhole, setting Cyan (and the CD-ROM format) up for later success with its follow-up Myst. Overall, though, this decision was perhaps the biggest disaster in Activision’s history. In 1991, Mediagenic filed for Chapter 11 bankruptcy. Two years later, a former 4Kids executive named Bobby Kotick picked up the pieces, renamed the company Activision, and moved it from northern to southern California, shedding much of the company’s staff in the process.
By this point Activision had ceased maintaining its own internal development team, in favor of acting as publisher to outside development teams – occasionally purchasing those teams outright. To this end, some of Activision’s first high-profile work under Kotick – Zork Nemesis and Pitfall: The Mayan Adventure – was produced by Zombie Entertainment. Later purchases would include such respected indie developers as Raven (Hexen) in 1997, Neversoft (Tony Hawk) in 1999, Treyarch (Spider-Man) in 2001, Infinity Ward (Call of Duty) in 2003, and Toys for Bob (Star Control) in 2005. Of note is that nearly all acquisitions have occurred since the debut of the Sega Dreamcast – in other words, within the now-passing console generation.
As CEO, Kotick took, and takes, pride in how detached he is from the game industry, seeing his disinterest in the material of the industry as a strength from a business standpoint. As Kotick told US News and World Report, “I don’t play video games. I’m not a techie. I’m the capitalist, the guy who knows how to take all the fun out of game playing.” To Kotick’s credit, he has indeed brought a sort of focus and determination to a company that had been struggling under some of the least effective management in the industry, transforming Activision from a struggling relic into the second biggest independent game publisher in the US, after EA. For that matter, Kotick has done pretty well by himself; in 2002, Fortune named him as one of the “40 richest under 40”, estimating his worth at $120.8 million.
At the same time, Kotick’s focus on business for the sake of business, with the act of videogame production almost an afterthought, has put Activision (and Kotick himself) in some precarious situations. Most recently, as reported on this site, the company has come under the scrutiny of the US Securities and Exchange Commission. Allegedly Kotick cashed in all of his stock options, worth a few hundred million dollars, after he and other “top executives” back-dated their stock option grants to before a recent plunge, so as to increase their value.
Likewise, in the wake of the “EA_Spouse” issue, complaints have arisen about working conditions in Activision’s subsidiary studios. Claims of 90-hour weeks for years on end and unpaid overtime have made the rounds. This past May, Activision was struck with a class action suit claiming unpaid wages and punitive damages. Again as reported here, EA recently settled its suit with Erin Hoffman, suggesting some precedent. Overall, amongst all the suits and investigations and allegations, these days it can seem – especially to the casual observer – like Activision is hardly ever out of hot water.
Today’s Activision is hardly a monster, of course. In a warm light you could view it as a holding company for some of the best, most well-loved talent in North America. Still, there’s a big breach between the focused, idealistic team of the early ’80s and the shell corporation of today – namely in that today’s Activision doesn’t really stand for anything in particular, and neither does it really produce anything. Its principles lie simply in profit, and it leaves both production and personality to its scattered acquisitions, on which it relies for that profit. That it has such good taste in acquisitions is, I suppose, at least some consolation. Its lack of vision, however, leads to some curious situations. See Kotick’s recent New York Times interview where he muses that “full downloadable games [are] so far in the future that it’s almost incomprehensible,” despite the success of Steam and Xbox Live, the experiments of companies like Telltale Games, and Nintendo’s pending Virtual Console.
Still, whatever Activision does tomorrow, and however irrelevant it might be today, its legacy was etched in alabaster over two and a half decades ago. Activision will always be the first third-party company, it will always have started the movement for individual recognition, it will always have been on the forefront of the argument for games as human expression. That it never got to complete its mission, and that it’s taken twenty years for us to find our way back, is beside the point. In the end, its founding principles will outlive even the company itself.
Electronic Arts
After the last section, the story of EA is going to sound kind of familiar – just amplified and a little more driven. When Trip Hawkins founded EA, he did it under the then-novel premise of an independent publisher; EA would run no internal studios, would produce no development of its own. Instead it would scout out, publish, and distribute the work of outside developers, operating under the early Activision principle of promoting programmers and designers nearly as much as the games they developed.
The name itself (based in part on United Artists) is telling; EA existed to proselytize the burgeoning art of electronic games; to act as a popular outlet for the voiced, yet scattered and unheard “software artists”. If anything, EA was positioned as – from a certain perspective – an improvement upon Activision’s founding ideals, out record-labeling the record producer even down to the packaging. Whereas Activision served to broadcast the names and statements of its own – of the disgruntled superstars of bestsellers past – Trip Hawkins wanted to dig up new talent; to act as a sort of equalizer so your future Richard Garriotts would have somewhere to turn. And hey, if those future talents happened to hit it off and make EA a bundle of money – then… well!
Indeed, EA started off well enough. In 1982, Hawkins left Apple Computer (formed in part thanks to Atari; see “Five that Fell” on this site), taking along several of his coworkers to staff his new venture. The initial plan, later put fully into gear by Larry Probst, was to sell directly to retailers – again an unprecedented idea – rather than work through a third party, the idea being that, as a professional conduit of other people’s work, EA needed the best profit margins and market knowhow in the business. The tradeoff was that EA promoted its artists to the teeth and shared a large chunk of the profits.
Beyond promising, EA’s initial 1983 software lineup has become legendary: Archon, Pinball Construction Set, and M.U.L.E., along with the successful Donkey Kong knock-off Hard Hat Mack and a lesser-known worm-training game.
For the next few years, EA would continue much in this vein, offering its distribution services to other publishers (EA didn’t put out enough games to maintain its channels alone) and irritating figures such as Richard Garriott due to what he perceived as a focus on showy marketing over quality product. Regardless, through the mid ’80s EA would release such landmarks as Seven Cities of Gold, The Bard’s Tale, Starflight, and Wasteland. EA began to experiment with licenses, especially celebrity-based ones – especially sports-based ones. And then… something strange began to stir. Despite all its early proclamations, EA began to get the artistic itch itself.
The result, in 1987, was Skate or Die! – a sort of a cross between Summer Games and 720°, taking the olympic format of the former and the stylization and catch phrase from the latter. And it wasn’t bad! Getting ported to the NES by Konami a full year before 720° hardly hurt, either. Speaking from my own experience, all the kids just assumed it was the NES port of their favorite arcade game from last summer. Surreptitious, though not altogether undeserved, win for EA. Since no one really complained, EA took silence as a blessing and forged ahead. Slowly, over the next few years, EA began to move more development in-house.
Around 1990, EA decided it wanted a part of Sega’s new console, and yet – like Accolade (and Atari Games with the NES, before them both) – they didn’t care for Sega’s licensing fees, preferring to manufacture their own cartridges. A few legal threats later, EA and Sega worked out a deal, though EA continued to manufacture its own oddly-shaped cartridges – and it is within these cartridges, combined with EA’s experience with celebrity sport licenses, that EA hit its true goldmine. Repackage essentially the same game year after year, adjusting its roster and adding a few mechanics to stay current, and sell it for full price over and over – it was a whole new model for game publishing. Both disposable and mass-market; buy it one year, you have to buy all of the upgrades if you want to stay current.
In 1991, before this circus got on the road, Trip Hawkins left to follow a new muse (one the world would soon know as the 3DO), leaving his company in the very rational hands of Larry Probst. At the time it seemed a reasonable business decision; Hawkins was more a game developer than a businessman, and after a decade his original ideals were making overt growth harder and shareholders impatient – so Probst knew exactly what to do.
No sooner was Hawkins out the door than the acquisitions (and Madden milking) began. The first came in 1991, with Stunts developer Distinctive Software (later to become SSX and NBA Live studio EA Canada). A year later, fate frowned on Origin; Probst picked them up and set ’em under the eye of Distinctive founder Don Mattrick, who took to the laid-back Origin staff like a schoolmaster, applying a strict discipline whether required or not, sometimes to the benefit and often to the apparent detriment of Origin’s work. Some staff began to suspect a conspiracy, or at least detect a conflict of interest, due to the shared resources between Origin and Mattrick’s own EA Canada: the less for one, the more for the other. The battles here are many and famous, Mattrick enforcing a policy of big blockbuster games over small, reasonably profitable projects, then holding the blockbusters to an impossible schedule.
The strained relationship with Origin presages a decade-long streak of what could easily be called ruthlessness. In 1995 EA bought out Bullfrog, another wunderkind-led brainstorm house; within two years, its chief Peter Molyneux left to start a new studio, complaining of similar interference; by 2001 Bullfrog was closed.
After Bullfrog, the acquisitions went into rapid fire, at least one a year, nearly every year: Lost in L.A. developer Manley & Associates (1996); Maxis (1997); Westwood and Tiburon (1998); pioneering online developer Kesmai (1999); Dreamworks Interactive (2000); former Sega Sports studio Black Box (2002); racing game developer Studio 33 and PC port master NuFX (2003); Criterion (2005); and three more studios this year: JAMDAT Mobile, Mythic, and DICE. Tallying in spin-off studios, that’s an average of 1.2 acquisitions a year since 1991 – half that, within the current hardware generation. Of the total, one-third have since been closed – all, again, within the past five years. All save Origin, Bullfrog, Maxis, and DICE have been renamed after their parent company, either in part or in whole.
The pattern to these acquisitions, if not universal, is infamous: find a company that made a really popular game; acquire the company and its properties; set the team on overtime churning out sequel after sequel to the game in question, until the staff leaves or burns out, or one of the products sells poorly; then close or restructure the studio, occasionally firing everyone on board. Developers and properties become as disposable as this year’s Madden. Of all EA’s acquisitions, only Maxis is known for retaining its autonomy and culture within the EA corporate structure, the jewel in EA’s crown.
Still, dark clouds, silver linings. Although EA has gone nuts in recent years – having abandoned all of its founding principles and developed an attitude of rapid growth for growth’s sake, whatever the long-term cost, thereby setting a poor standard and a poor example for the rest of the industry and contributing greatly to the mess that is the American game industry today – at least now we know what to look out for.
Thanks in part to Erin Hoffman and a high-profile class action lawsuit, that overtime business in particular (mandatory 90-hour weeks, years at a time, no compensation) has come under close scrutiny, bringing Larry Probst’s EA into a critical light. When, therefore, EA decided to buy up 20% of one of its closest rivals, freaking out Ubisoft, everyone’s eyes were turned to EA, glaring, almost daring them to take the next step – a step which has, as yet, not manifested itself. And when EA chose to pull the rug out from any and all possible competition to its Madden brand, with its exclusive NFL, ESPN, and college football deals, again people coughed on their corn flakes and took notice. Nothing’s happened yet, as legally EA has done nothing wrong – though the ethics at play have now become a hot discussion topic.
There is a balancing force to the world. When things get a little bad, people hunker down and wait for better times. When things get worse, they say that’s just the way things are. But when things get ridiculous, people tend to get angry and to react. We’re nearing that stage, and I think the game industry in general is due for some really significant structural changes that will directly impact the way it will function in years to come. When the way we make videogames changes, the way we think about making them changes alongside. And when the way we think about videogames changes, videogames themselves will start to change. And if all these changes reflect a new focus on the lives and health of the people behind the games, then that humanity will tend to come out in the end product – making for a livelier, healthier game, and game industry. When we get to this point – where we can take care of ourselves and our own – the industry will be effectively mature. And then that is when the magic will happen.
LucasArts
At about the same time Trip Hawkins was deciding what to name his company, George Lucas – hot off the success of his first superstar collaboration with Steven Spielberg, and steeped in post-production on the concluding chapter of his Star Wars trilogy – decided that, what with his new high-tech special effects and CG houses, it wouldn’t hurt to branch off into this videogame stuff that Activision was making such noise about, calling it a creative medium and talking about its designers like rock stars or film directors. Following his whim, Lucas pulled together a few talented programmers and artists, talked to Atari about a development partnership, got Epyx on the phone to help publish, and in May 1982 founded Lucasfilm Games.
By 1984, the team had a couple of games to show – Ballblazer and Fractalus, both basically action-oriented affairs, and both leaked to pirate BBSes shortly after Atari received its unprotected review copies. Despite being widely available for months before release, the games sold pretty well; Lucasfilm hired some more staff and set to work on a second wave – this time without Atari’s help.
Amongst the new staff was a recent graduate named Ron Gilbert, who did some minor work on the Commodore port of Koronis Rift – one of Lucasfilm’s two releases for 1985, alongside The Eidolon – before setting to work on his own project. Inspired by ICOM’s classic Mac game Deja Vu, Gilbert began sketching out an adventure game set in a haunted house. While Gilbert was working out the logistics, eventually writing his own scripting language – “Script Creation Utility for Maniac Mansion“, or SCUMM for short – to ease along the design, other Lucasfilm staff readied their own adventure game based on the David Bowie vehicle Labyrinth. Aside from being Lucasfilm’s first stab at the graphical adventure genre, this game marked the first bleed-over between the company’s movie and game divisions. Also around this point, a sub-group within the company started to fiddle around with flight sims.
In 1987, Maniac Mansion arrived and changed the direction of Lucasfilm for years to come. The game would come to be ported to every platform under the sun, including a TV sitcom; for the next five years, Ron Gilbert would oversee much of Lucasfilm’s output; and in one form or another, Gilbert’s SCUMM system would become the backbone of a whole era of game development. The next year, Gilbert and his team released their follow-up, Zak McKracken and the Alien Mindbenders – featuring supporting illustrations by a certain Steve Purcell.
Another year on, Lucas wanted a tie-in with his conclusion to the Indiana Jones trilogy, out in theaters that summer. Continuing to attract new talent, Lucasfilm hired a couple of guys named Tim Schafer and Sean Clark. Then in 1990, everything started to come together. George Lucas consolidated his spin-off companies – ILM, Skywalker Sound, Lucasfilm Games – into a holding company called LucasArts Entertainment. Brian Moriarty stepped in to develop fan favorite LOOM, then vanished again. And Ron Gilbert collaborated for the first time with Tim Schafer and writer-programmer Dave Grossman, almost accidentally resulting in a pirate-themed comedy-adventure called The Secret of Monkey Island. By this point, LucasArts had gained broad recognition as the master of its field, rivaled only by Sierra in the graphical adventure genre – which by this time was, to an extent, almost synonymous with PC gaming.
The next couple of years passed as you’d expect: a Monkey Island sequel by the same superstar team (usually considered even better than the first game); in lieu of an actual movie sequel, the successful Indiana Jones and the Last Crusade game received a successful game-only sequel. Between the two sequels, ILM and Skywalker Sound were consolidated into Lucas Digital, leaving the game division alone with the LucasArts brand. And then Ron Gilbert jumped ship, to form his own company.
Undaunted, the rest of the LucasArts staff stepped up to fill Gilbert’s boots. Grossman and Schafer continued their partnership, devising a sequel to Gilbert’s Maniac Mansion. After years of incidental contributions and little in-jokes, Steve Purcell stepped forward and designed a game around his long-time characters, Sam & Max. Both games were smash hits, as usual. And then that same year, 1993, suddenly LucasArts decided it was a good time to base a game on Star Wars.
Probably inspired by the success of Origin’s Wing Commander, the first few games all followed in the tradition of LucasArts’ popular flight simulators, putting the player in the cockpit of an X-Wing or a TIE Fighter. In these early days of the CD-ROM, the third such game – Rebel Assault – became something of a showcase game, much like Myst and The 7th Guest. The game’s success came in large part due to its incorporation of real (albeit grainy) footage and speech from the film trilogy, a huge novelty at the time. Suddenly, the writing was on the wall; LucasArts knew where the money lay, and that was in milking nostalgia. It was at this point that Dave Grossman left, to pursue a freelance career.
Carrying on alone, in ’95 Schafer delivered his final 2D adventure game, the comparatively low-profile Full Throttle. Meanwhile, Brian Moriarty popped back into focus long enough to finish off The Dig, an absurdly long-delayed project involving Steven Spielberg and Orson Scott Card. Again, despite the high profile of the contributors, the game kind of sailed under the radar. One game that didn’t was Star Wars: Dark Forces – one of those new-fangled first-person shooters, wrapped up inside The License that Sells. The message seemed clear enough: adventure games were out – especially if all the top designers were leaving – and Star Wars games in all the trendy genres were in.
In the wake of Tomb Raider and Quake, and the sudden explosion of 3D cards, the PC game landscape changed almost overnight. Where only a few years before the PC was riddled with deliberately-paced software geared toward a more cerebral demographic, now the PC was at the forefront of graphics and sound technology, and the fast-paced gameplay that came alongside; young graduates of the Nintendo generation were attracted to the Miyamoto-inspired designs of id Software and its copycats, and persisted in ignoring all the stuffy, cerebral games that had put them off PC games in the past.
Catering to this crowd, LucasArts released a sequel to Dark Forces that took full advantage of the new graphics cards; it was another smash hit. At the same time, a second Monkey Island sequel was released, to rave reviews. Nevertheless, it would be the final game developed with Ron Gilbert’s SCUMM engine, and the end of LucasArts’ 2D adventure line.
Taking one last stab, in 1998 Tim Schafer tried his hand at a 3D adventure game in the classic LucasArts style. Though universally praised, and often cited as one of the best computer games ever made, the game – Grim Fandango – was sort of a flop. In 1999, The Phantom Menace was released to theaters, sending Star Wars furor into the crazy place. After some frustrated attempts to get a PS2 game going, Tim Schafer – the last of the Guybrush trinity – left in 2000 to form his own studio. That same year, coincidentally, some of the remaining adventure staff tried out a new 3D Monkey Island game using the Fandango engine. It didn’t go down so well. And that was that: the final LucasArts adventure, and the final nail in the coffin.
In 2002, someone at LucasArts noticed that the company was producing almost nothing save Star Wars products – and that it was pumping them out so quickly, the quality was beginning to suffer. The company pledged that from that point on, at least fifty percent of its releases would have nothing to do with Star Wars. Some of the old adventure staff got to work on sequels to Full Throttle and Sam & Max; despite some progress, eventually both games were shelved and many of the staff departed to form Telltale Games. Later, Dave Grossman and Steve Purcell would join their former associates at Telltale to hash out a new market for PC adventure games. Before it ever got started, the fifty-percent promise fell by the wayside.
LucasArts continues on in its way, continuing to rely almost entirely on Star Wars though at least exploring the property in novel ways, through games like Knights of the Old Republic, Star Wars Galaxies, and Star Wars: Battlefront. And that’s all nice, I suppose, for people who are already obsessed with the property. For that matter, I’m sure the company has never been more secure or profitable. If you don’t give a damn about Star Wars, though, you’re… kind of out of luck.
On the upside, much of the original LucasArts talent remains active – and actually, some of the work happening at Double Fine and Telltale is amongst the most promising and progressive stuff happening in the American game industry. Maybe this course of events was inevitable. In retrospect it seems kind of bizarre that LucasArts waited a full decade to even produce a Star Wars game, and that over that decade they only bothered to tie in three games to current films. It’s weird that the company was basically allowed to run on its own, develop its own culture and in-jokes and audience and identity, completely apart from the films produced by its sister company. Logistically speaking, it would seem an insane waste of resources and potential. And yet there it was – it persisted, for fifteen years, its own little bubble. Then reason caught up, business took over, and the individuals were left to find their own way.
Rare, Ltd.
Though for the purposes of this article it’s entirely a coincidence, 1982 seems like a sort of holy nexus for the game industry; it’s the year of EA, the year of Lucasfilm Games, and the year the Stamper brothers, then in their early twenties, began to tackle the ZX Spectrum. The Spectrum was an odd system, a phenomenon in Europe (especially the UK) yet completely unheard of here. The best analogy I can come up with is that it served as a parallel for our Apple II – except even more mainstream. Therefore, where we got Sierra and Origin the Brits got the Stampers, buried under two levels of pseudonym (“Ashby Computer Graphics” for the company, “Ultimate: Play the Game” for the public brand).
A curious phenomenon of the Spectrum market is that whereas Apple software tended to stem from Dungeons & Dragons (through one path or another), the UK stuff tended to be based more in an arcade sensibility. The Stampers, coming themselves from an arcade background, were right at home with the hardware and the UK development scene. They digested the hardware, found exploits that made for interesting game concepts or visual approaches, and over three years proceeded to put out a nearly unbroken string of mega-hit releases (by British standards) – fourteen, by the end of 1986. And that’s not even counting their experiments with the Commodore.
Over that period the Stampers worked eighteen hours a day, seven days a week, supposedly taking only two Christmas mornings away from their screens. Indicating a Lamborghini, in a famed late ’80s interview, Tim Stamper explained “If you want that, you have to work to get it. I don’t feel it’s any good having engineers who only work nine to five, because you get a nine-to-five game. You need real input.” The result of that hard work, and the ridiculous release schedule of original, well-designed software that came out of it, was something of a Beatles-scale fandom. Yet the Stampers’ work schedule was such that they simply had no time for interviews or public appearances, adding to their mystique and – if anything – helping to make them the most in-demand development team in Britain.
And then the Stampers did something nobody quite understood – they sold off their “Ultimate” brand, and dropped the Spectrum like an old shoe. Even when the Stampers laid it bare, years after the fact, still no one back home really understood what they were on about. The thing is, back around early 1984 the Stampers got ahold of Nintendo’s new Famicom hardware, and were completely taken aback by it. They were convinced that, before long, it would become a sensation both in Japan and in the US, where Nintendo was planning to market it. In a burst of enthusiasm, they bought all the software available for it, and immediately set to reverse-engineering the system. They formed a secret subdivision of ACG called “Rare” to focus on Nintendo development, while Ultimate held up the public front, continuing its diligent Spectrum output.
By early 1985 the Stampers had hacked out the Famicom and had begun to write software for it; they brought some of that work to Nintendo, as a proof of concept – the first Western developers to do so. Nintendo was sufficiently impressed to hand over the official documentation that the Stampers didn’t even need at this point, and an official license to produce for the system – albeit under a unique “freelance” scheme. Whereas traditional publishers, under Nintendo of America’s “quality assurance” standards, were only allowed a certain number of releases per year, Rare was allowed an unlimited number of releases, provided they could find a publisher for each. And indeed, by the late ’80s Rare – by now the official company name, as it was in the Nintendo business for good – was publishing more games per year than anyone in his right mind would suggest: from six to fifteen to seventeen separate releases.
The curious issue here is that, whereas the Spectrum was a smash only in the Stampers’ corner of the world, that corner is also one of the few places left largely unaffected by Nintendomania. So around the time Super Mario 2 was hitting shelves (alongside Rare’s own R.C. Pro-Am and Cobra Triangle), British magazines were starting to wonder what happened to their wonder children, and why they were wasting time on this strange Japanese box that nobody had ever heard of. When Tim Stamper explained that the Spectrum was over, that it was a dead end and that Nintendo was the future, his peers thought of Donkey Kong and scratched their heads. When he explained that there were ten million Nintendo consoles in Japan, and that the system was also a runaway success in America, people took the statement much as you’d take statistics on Aibo sales. It just didn’t register, or make any sense.
Still, the Stampers knew true appreciation lay in the sales numbers – and through a ridiculous five years of productivity, sales are what they got – albeit spread across dozens of small, experimental games and ports, rather than through any one or two smash hits. From 1987 through 1991, Rare released forty-four games; two for the newfangled Game Boy, the rest for the NES. Of those, four (including sequels to such Rare staples as the Wizards & Warriors and Jetman series) were actually produced by the Pickford Brothers, of Zippo Games – the only British studio Rare chose to work with. Twelve were wholly original; fifteen were sports or film, TV, or comic licenses; eleven were ports of arcade, PC, or even pinball games. Even the ports and licensed games, however, come off as carefully-chosen experiments; games like Marble Madness allowed the Stampers to show off their command of physics and isometric graphics (two of their trademark proficiencies). Pin Bot let them employ unprecedented split-screen graphical tricks.
At the time, the Stampers didn’t seem to much distinguish between original and licensed projects; they were fans of videogames in general, happy to see games that were made well and were successful – in part because then they could break down those games to study what made them so entertaining. The results of this research are pretty obvious; see their port of Marble Madness, then see Snake Rattle ‘n’ Roll. Far from resenting the grunt work, they put their own stamp of identity on it, then took what they could for their own use. “There’s nothing wrong with moving one step at a time,” Tim Stamper has said. “And that’s exactly what we did: we paid our dues by producing a lot of conversions in the early days.”
In particular, Tim and Chris Stamper were impressed with Japanese games – with which they felt some common roots, taking arcade action then making it deeper, longer-lasting for play at home. At the time, Chris Stamper expressed some frustration that none of their peers in the UK seemed to “get it”; that there was a bigger world out there, that nobody was bothering to study. It was almost like, despite all the obvious talent around him, no one was even trying to break out of the ghetto. “Britain’s got the best talent without a doubt. We should be producing the number one games, and it’s not happening.” To that end, the Stampers tried to serve as sort of a role model for the entire British game industry, drawing out the merits and appealing qualities of Nintendo’s games and trying to instill a rigid work ethic that they felt was necessary to compete on a world stage. As related in a 1988 Games Machine interview, “it is only through examining Japanese-made games and then putting the theory into practice through their painstakingly built contacts that they have reached the point they have.”
Although this bootstrapping attitude has indeed been responsible for much of Rare’s success, it has also caused friction between the Stampers and their peers; after producing a couple of games for Rare (for little compensation), Zippo Games fell on hard times. Rare bought out the Pickford brothers and set up their studio as “Rare Manchester”. Although their relationship had been amicable at arm’s length, the Pickfords soon found working directly under the Stampers more than they could bear. After the cancellation of a Game Boy wrestling game that one of the brothers was particularly proud of, the Pickfords picked up and left. And it was about then that life began to get a whole lot more complicated for Tim and Chris Stamper.
1992 marked three years since the release of the Sega Genesis, one since the Super NES. As with the Spectrum before it, the Stampers had pretty much maxed out what they could do with the original NES. That meant it was time to, once again, start thinking hard about the future. Although they ported a few Battletoads games around, it was clear the Stampers had missed leading the charge this time. After a brief period of angst – they hadn’t climbed this far to end up “just another developer” – the Stampers decided the best way to stay ahead was simply to manufacture their own revolution.
Over the NES era Rare had accumulated a fair pool of cash, thanks to all of those silly ports and licensed games. Now it was time to invest those profits in the most advanced computer technology around at the time: a bank of SGI workstations larger than any in Britain. Midway had recently made waves with Mortal Kombat, which used digitized footage of actors instead of hand-drawn art. The Stampers were paying attention, and – combined with the lush palette of the SNES – figured they knew their calling. All they had to do was make a simple platformer, then insert high-quality CG renders in place of bitmapped illustrations, and the entire game would look like a 3D-rendered cartoon. Bingo, massive hit – and one that, in Rare style, capitalized on the strengths of the target platform.
Nintendo loved the idea for a showcase title, and handed over Donkey Kong for rehabilitation (figuring, for one, that Donkey Kong was so underused that it didn’t matter so much if someone screwed with him – for another, if Rare pulled through then the contrast with the original games would be particularly striking). At the same time, Rare set to work on one of its few real arcade games: a home-rendered answer to Mortal Kombat called Killer Instinct. It was pretty successful, and certainly a better game than Mortal Kombat ever was. Donkey Kong Country, though – well. It was both a huge success, and kind of a stupid one.
Although the game was very shiny, and there wasn’t any particular problem with its design, it was also extremely vapid – to that date, perhaps the closest thing to a platformer-by-numbers that Nintendo ever published. While working on Yoshi’s Island, Miyamoto expressed distaste for the game and irritation that his superiors kept pressuring him to make his game more like Rare’s. Furthermore, as time has moved on, the early CG rendering tends to give the game a tacky look. Still, it sold eight million copies – and that was enough to cause Nintendo to buy out 25% of Rare’s stock, at that time an unheard-of gesture for the company. And practically on their own, sales of Donkey Kong Country were enough to set the Stampers up for life.
Indeed, it was around here that Rare started to dial back on its frenetic pace; gone were the years of a dozen releases, each containing a handful of new ideas, one or two maybe a rough classic. By this point the Stampers felt that they had pretty much sussed out the Nintendo formula, and as Tim Stamper put it at the time, “I’d rather see one single high-quality game than ten low-quality games.”
Fast forward to 1997; Rare’s progress to date has been two Donkey Kong sequels, one sequel to Killer Instinct, five Donkey Kong/Killer Instinct-related ports, a Mario Kart clone starring the Donkey Kong cast, and a baseball game. Then come two curious N64 games: Blast Corps, an experimental little game where a bulldozer has to clear the path for a truck loaded with nuclear missiles, calling to mind such NES experiments as Cobra Triangle and R.C. Pro-Am; and a stab at a first-person shooter, based on a James Bond movie that had been out of theaters for ages.
The Bond game was Nintendo’s idea; Rare was hesitant to take it on, though eventually acceded, choosing to staff the game almost exclusively with people who had never designed a videogame before – almost as much to see what happened, as anything. This group of greenhorns plugged away for two and a half years, switching platforms once (from the SNES to the N64, explaining the huge delay), producing a game that frankly nobody outside the subgroup was very enthusiastic about. The broader Rare was unconvinced, though; as David Doak said, “no-one really thought [it] was going to be any good. The general feeling was we were a bunch of students wasting time.” No one paid attention at trade shows. On release, it got okay reviews and it sold all right, though not spectacularly. And then over the years it just kept on selling. By the time the N64 was retired, GoldenEye 007 had sold eight million copies, making it – at least in the US – the biggest-selling game for the system.
It is also in 1997 that Rare first began to bleed staff; Sony lured away a bunch of background artists from Donkey Kong Country, and a few programmers from Killer Instinct; whatever they were up to never manifested itself. The following year, eight months into production on Perfect Dark, a bunch of the GoldenEye staff took off. Martin Hollis, who had been in charge of both projects (and indeed had convinced the Stampers to go forward with GoldenEye) left to advise Nintendo of America on the GameCube. David Doak, writer and designer on the games, chose to form his own studio – Free Radical – taking a bunch of his “B-team” staff with him. Meanwhile the main branch of Rare released Banjo-Kazooie, a 3D platformer that took Mario 64 and fleshed it out in ways that maybe it wasn’t meant to be. It sold okay, too.
1999 brought a 3D entry to Rare’s Donkey Kong series, often considered one of the worst things Rare ever produced; a curious cooperative shooter called Jet Force Gemini, and a couple of Game Boy Color releases. In 2000, Rare put out a Mario Kart clone with Mickey Mouse in it; a poorly-received sequel to Banjo-Kazooie that, if anything, made the problems in the original game clearer; a Game Boy Color port of Donkey Kong Country, and – finally – Perfect Dark. Despite the staff turnover and many conceptual problems, the game was released to a solid, if not amazing, reception – usually blamed on how venerable the N64 was by then. As soon as the game was done, fifty more employees streamed out the doors.
2001 gave the foul-mouthed Conker his day out; the consensus seems to be that the profanity and violence weren’t really funny or interesting, nor as shocking as they pretended to be. The game was pretty solid, though, if not very inspired. In 2002 Rare delivered its only GameCube game, a by-the-numbers Zelda clone with the Star Fox characters shoehorned in at the last minute. It sort of sucked. And then… it was over.
Out of what seemed like nowhere, Microsoft came along and scooped up Rare for a boggle-worthy $377 million; Nintendo allowed Rare to keep all its own original properties, including games that Rare had been working on for the GameCube. Thirty more employees fled the ship. The Internet collectively wondered how Nintendo could survive without Rare. Microsoft cackled with glee, as it had claimed one of the biggest, most well-respected studios in the West. And then, over the course of three years, Rare released two Xbox games. Both were tremendous flops. Rare readied two of its biggest GameCube projects for the launch of Microsoft’s next-gen console, which – after almost blind pre-release hype from every gaming publication – sold… okay. Kameo has mostly been ignored, and Perfect Dark Zero has been pretty thoroughly savaged. The bulk of Rare’s output since its total buyout has been, ironically enough, for the Game Boy Advance – yet another set of Donkey Kong ports, plus a few mascot-driven games drawn from Rare’s character vault.
Rare was a hard-working little company, based in the primal need to “make it” in the bigger world, to pull itself, and anyone else it could take with it, out of the morass of what its founders considered a complete lack of perspective or ambition, in its home market. They studied the hardware backwards and forwards; they pored over everyone else’s games to see why they were successful. They approached the industry, and the medium, like scientists, or alchemists, looking for the secret formula that would explain the mysteries of the Cosmos – and all the while, as they looked and mimed and probed and created their own sloppy, strange monsters in the dapperest suits in the world, they never quite tapped into that which they were trying to find. At worst, they were only ever a step or two away from greatness, from completely “getting it”; at best, they pushed their hardware of choice to places where nobody else even dreamed it could go – and that was just before lunch.
And then somewhere along the way they got desperate; it had been so long, and they’d still not completely cracked the formula – so they started to lose confidence. The world was moving on, around them – and they had to stay on top, or their whole mission was a failure. In comes the devil. They saw a facile, if ritzy, solution, and they went for it. The result: fame and fortune beyond their dreams. Clearly they had found the answer, so there was no point in looking further. Even their role model, Nintendo, showed faith in them (albeit during one of Nintendo’s wonkiest stages, creatively). They had made it – and it all came from filling in the blanks. Take basic 2D platformer, add rendered sprites. Take Mario 64, add more things to collect. Take Zelda, add more tasks and junk to collect. They knew what they were doing, so now all they had to do was do it.
Rare means well; it always has. Still, there you go: from a prodigious little company that showered out earnest little gem after gem, to the point where its British fans flocked to it like the Beatles, anxiously awaiting the next single, to a monolith that puts out one lame platformer every other year. The one thing they never really found is subtlety. Spread across a dozen bizarre experiments, you can sense it – in the network of their scrambled efforts. With all their energy focused on one huge game at a time, spent on following the “correct” pattern of design to the scientific letter, that humanity’s gone. There’s no room for nuance when you’ve got all your eggs jammed in someone else’s basket.
id Software
Though id is most remembered, and sometimes cursed, for kicking off the 3D era and all it’s done to and for the industry (particularly on the PC end), id’s first real impact lay not in the graphics card wars or online techno-jock matches, but rather – like Activision and EA before it – in a new way of doing business. That, and bringing a new perspective to American PC software – one that perhaps the Stamper brothers would have appreciated.
When John Romero, John Carmack, and Tom Hall were all employed at Softdisk – sort of a digital magazine company – they were ostensibly there to write PC games; to them, though, videogames meant Nintendo: products with action, smooth control, and a tactile world. The problem was, at that time the PC was primarily a business platform; although games certainly existed, and faster processors, improved displays, and sound cards were all conspiring to make those games more viscerally appealing, in general the PC was still more of a hobbyist platform, geared primarily toward the Dungeons & Dragons audience. Though technically more powerful than the consoles of the time, the hardware wasn’t designed or dedicated to run Altered Beast: it was built to run WordStar. Perfectly fine if you want to be the next Richard Garriott; frustrating for the midwestern kid who breaks into the game industry, hoping to be the next Miyamoto.
As it so happens, the trio in question was clever enough, and their workplace atmosphere oppressive enough, that they spent nearly all of their free time subverting the system: working on their own experiments on company time, conspiring together explicitly because they weren’t supposed to collaborate openly. Out of his doodling, John Carmack managed to cobble together a routine for a smoothly-scrolling game level – much as in nearly every console game after Super Mario Bros. Staying up all night, Carmack and Hall reproduced the first level of the then-new Super Mario Bros. 3, in near pixel-perfect detail (at least as well as EGA allowed), then replaced the main character and left the game on Romero’s desk as a gag. Romero flipped out and showed the game to his bosses, who passed the game under Nintendo’s nose. After some consideration, Nintendo decided it wanted nothing to do with PC games and walked away. For its part, Softdisk wanted nothing to do with the graphics routine because it didn’t support four-color CGA graphics (which I can attest many people were still using in 1990).
It was around this time that a guy named Scott Miller started pestering the Softdisk crew, sending reams of fan-mail praising Romero’s work in particular. After some confusion, it turned out that this guy was the head of a small company called Apogee – one of the first outfits to distribute games as shareware. To date he’d had modest success, though he was hoping to lure Romero away to – perhaps – their mutual benefit. When Romero showed him the Mario demo, Miller offered to pay them whatever they needed to get the game off the ground. He had five thousand dollars in the bank; they asked for two.
So, gearing up after hours, the trio set out on their first official project together and the first game they could really call their own. Tom Hall came up with the concept of a boy genius on an intergalactic mission; Carmack and Romero set to programming and designing the game. Toward the end of production, Tom Hall realized he wasn’t as good an artist as he thought – so he called on an intern named Adrian Carmack (no relation), whose art had impressed him, to do some touching up.
Commander Keen was an instant sensation; one month after they sent the discs off, Miller sent them back a royalty check for ten thousand dollars. After blinking a few times, the first thing Carmack and Romero did was invite their boss to lunch and resign in the name of all four designers. A few weeks later, in February of 1991, id Software was officially formed – though before they could really buckle down, they still had a bunch of obligations to fulfill for Softdisk. Over the next year, while fiddling with the follow-up episodes to Keen and planning its sequel, the id crew worked like a renegade Stamper clan, staying at work to all hours, “borrowing” office computers for the weekend then returning them on Monday morning before anyone noticed, juggling half a dozen games at once, in all genres, with all different kinds of technology, compressing years of experience into months.
Meanwhile, Apogee released its own action platformer called Duke Nukem (liberally borrowing graphics from other games), to similar success – further helping to establish shareware as a model for game publishing. It would be id’s next major release, though, that really drove the point home – and that put shareware squarely in the industry’s attention.
All it took was being in the right place at the right time; in 1991, John Carmack happened to see a demonstration of Ultima Underworld, Blue Sky’s upcoming 3D dungeon-crawler for Origin – and something in him popped. Although true 3D would take a tremendous machine to process – indeed an issue with Blue Sky’s game, on release – perhaps if Carmack could find a neat way to fake it, to make a sort of smooth 3D scrolling to match his smooth 2D scrolling, that could run on any system… maybe he’d be onto something. After a couple of rough demos for Softdisk – one with blank, colored walls, another a rattly 16-color thing with texture-mapped walls – Carmack perfected a raycasting technique (sort of like projecting imagery through a prism) that produced a convincing effect at a very low burden to the processor. Basing their next project on one of their favorite Apple II games, in 1992 id presented Apogee with Wolfenstein 3D. And suddenly the PC was the cool place to be.
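For the curious, the core of the trick is small enough to sketch. What follows is a toy illustration of the general raycasting idea in modern Python (my own sketch, not id’s actual code, which was rather more optimized): march one ray out per screen column, and let the distance it travels before hitting a wall set the height of that column’s wall slice – that’s the whole 3D illusion.

```python
import math

# A tiny grid map: 1 = wall, 0 = open floor.
WORLD = [
    [1, 1, 1, 1, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [1, 1, 1, 1, 1],
]

def cast_ray(px, py, angle, step=0.01, max_dist=20.0):
    """March along the ray from (px, py) until it enters a wall cell,
    returning the distance travelled."""
    dist = 0.0
    while dist < max_dist:
        dist += step
        x = px + math.cos(angle) * dist
        y = py + math.sin(angle) * dist
        if WORLD[int(y)][int(x)] == 1:
            return dist
    return max_dist

def render_columns(px, py, facing, fov=math.pi / 3, screen_w=40, screen_h=20):
    """One ray per screen column; nearer walls become taller slices.
    The cosine term flattens out the fisheye distortion, so a flat
    wall seen head-on renders as a flat band rather than a bulge."""
    heights = []
    for col in range(screen_w):
        ray_angle = facing - fov / 2 + fov * col / screen_w
        dist = cast_ray(px, py, ray_angle) * math.cos(ray_angle - facing)
        heights.append(min(screen_h, int(screen_h / max(dist, 0.1))))
    return heights

# Standing in the middle of the room, facing the east wall.
heights = render_columns(2.5, 2.5, 0.0)
```

A real raycaster steps cell-by-cell through the grid (a DDA walk) rather than in tiny fixed increments, and texture-maps each slice; the principle, though – one cheap 2D ray per column standing in for a full 3D scene – is exactly why it ran so lightly on the processors of the day.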
A year later, Carmack perfected his raycasting engine and Romero perfected his design in Doom – reaching a level of breakthrough success unprecedented for a PC game, for a shareware game, and for a small team of kids who were told they couldn’t make the kinds of games they liked. With Doom, the aesthetic of the first-person shooter was essentially stabilized – incorporating elements of adventure, exploration, horror, suspense, and dark humor with the simplistic, console-like controls and premise (qualities often overlooked, in subsequent FPSes – that is, until Half-Life). Not only was Doom possible on a PC, and not only could it run on nearly any modern system; it also was patently impossible on the consoles of the day – at least without sacrificing a good deal of quality. id had turned the clunky PC into an action powerhouse, and freely given away the keys to whoever wanted to hop in and test it out. The game industry has never been the same since.
Where it all went wrong, of course, is where id began to push the technology further than current systems knew how to handle – then started intentionally designing years into the future, just expecting the hardware would catch up with them. That started a mad rush for the fastest, most expensive systems and graphics cards, and in turn created an expectation that every game had to milk the newest hardware to its limit – a spiral that sent PC games back into their niche, where only the hardest of the hardcore could keep up with them and only the flashiest games had a chance of selling to this increasingly narrow audience. Today the cycle has kind of hit a lull – to the point where a halfway decent card from a few years ago is more than adequate to play the biggest games released last week, and where a bunch of subsided genres have started to make a comeback. Still, yikes.
And the other silly thing is that after the perfection of Doom, id began to pay less and less attention to the actual designs of its games, figuring its energy was better spent on the technology. Compared to Doom, Quake was a dreary and no-frills package (however impressive the engine and Nine Inch Nails were); Quake II barely had a single-player mode. Somewhere in between, John Romero got himself fired and stormed off in a huff, determined to prove the dominance of design over technology. (Tom Hall had already left, over squabbles with Romero.) In 1998 Half-Life exploded into prominence; since there was no competing with that, Quake III moved its emphasis fully over to multiplayer, resulting in criticism that it was more a ridiculously elaborate game engine than a videogame – much like, come to think of it, id’s last couple of games. Even as a multiplayer sports event, the game was to some extent overshadowed by Epic’s Unreal Tournament.
At this point the Carmacks were alone as the (now very rich) company’s elder statesmen. John Carmack began to give increasingly strange interviews, where he spoke of his boredom with game design and of the new one-man space rocket he was building in his back yard. Longtime associates like Raven and Grey Matter began producing id’s games for it – sequels to Wolfenstein and Quake. Eventually, feeling the pressure to make a “real” game again, id dialed back the clock and tried a modern-day sequel to Doom. Though again technically masterful, the game itself arrived at the same time as Half-Life 2; by comparison, id’s game was criticized as clumsy, trite, and old-fashioned. Finally last year, following an attempt to sell the company to Activision, Adrian Carmack found himself fired from his own company, leaving John – a man more interested in his code than in game design or in running a company – as the sole figurehead.
id was the heavy metal band of game studios. They made their smash debut, their even bigger follow-up, they became rich and powerful overnight – and then slowly the excess and the egos ate away all they had made, as each member backed into his own corner, then vanished. And then there was one, the quietest and the brightest and the least interested of all. At their earliest and best, when all their energies were still on making the best game possible available to as many people as possible, they were an inspiration. When they started to get distracted, they became a distraction. When they began to fragment, they became a disappointment. Now at least, with the squabbling over, I get the impression that John Carmack has been granted a kind of peace. Perhaps that’s all one should hope for in the end.
Weaving the Web
The theme this week, for those who have been following, is independence: the struggle to achieve it for one’s self or for others, or simply its presence in places where you’d never expect to find it. It is that cooperative struggle for individuality, for the assertion of our humanity, that leads us to all grace. And it is the abandonment of confidence, of trust for the power of the humanity in ourselves and in others, that leads to cynicism – to desolation, to loss, to potential unfulfilled, to everything that makes life disappointing today and depressing tomorrow.
These have been five tales of bold, inspired steps for the improvement of the entire game industry – for in our every action each of us leaves ripples that will take years to run their course – of five blazing stars that petered out, often before their original missions had even been achieved. The purpose of this article lies not so much in criticism of the figures in question, as in observation of what became of their ideals, and why. It’s a subject worth studying, I think; the more you know of the pitfalls and what lies at the bottom, the more inclined you are to watch your step in the future.
If the last couple of years have shown me anything, it’s that there is no bad training that can’t be untrained, no giant that can’t be felled, no accepted knowledge that can’t be challenged – and given a voice and a focus, humanity does tend to win out in the end. I think this industry’s headed for something grand, before too long.