Phantom Fingers: The Series — Part Five: Myths and Legends


It is 1981. Somewhere between testing and mass release, interest in Nintendo’s Space Invaders clone Radar Scope had cooled. It’s not that the game was poor. It’s just that six months earlier Pac-Man had changed the arcade landscape, and in the narrowing market for Invaders clones there was only room for excellence. Do we order Radar Scope, or do we order Galaga? Easy choice.

Enter the slacker art school kid who was only ever hired as a favor to his family. Shigeru Miyamoto was told to recoup losses by designing another game for the returned Radar Scope hardware, preferably aimed at US audiences. Inspired by Pac-Man, Miyamoto took pretty much all of Iwatani’s new ideas of scenario, character, empathy, and play narrative, and built a whole game on them without the traditional clutter.

( Continue reading at Game Set Watch )

Craft Service


by [redacted]

Over the years, game design has calcified. If I were to pick a turning point, I might point at the SNES — a system of broadly appealing games that delivered exactly what people expected of a videogame, challenged few perceptions, and established the status quo for 2D console-style game design. Since then it’s been hard to get past the old standards — the prettied-up enhancements of Super Mario 3, The Legend of Zelda, and Metroid that added little new in terms of expression or design language, yet that refined the hell out of some proven favorites.

You could say that the SNES was the epitome of Miyamoto-styled design (even in games by other developers), and you’d have a reason for saying that. Namely, it was the Miyamoto Box: Nintendo’s reward to Miyamoto for the broad appeal of his NES catalog. Meanwhile Miyamoto’s opposing force, in Gunpei Yokoi, was rewarded for his invention of the Game Boy by having his studio removed from mainstream console development to support his brainchild. The message was clear: Miyamoto’s way was the successful one, so he would be in charge of everything important from here on.

The thing is, Miyamoto is just one voice. He had a few brilliant ideas in the mid-1980s, all born out of a particular context and in response to particular problems. And then by the turn of the ’90s he was pretty much dry. All that was left was to codify his ideas, turn them into a near law of proper design — regardless of context — and then sit back to admire his work, while new generations carefully followed his example as if manufacturing chairs or earthenware pots. A videogame was a videogame, much as a chair was a chair. It was a thing, an object, with particular qualities and laws.

Thing is, videogames aren’t things; they’re ideas.

( Continue reading at DIYGamer )

The Playlist / Those Tenuous Twos


by [name redacted]

You may have read the first part of this column in the December 2009 Play Magazine. It was intended as a single article, and the start of a whole series of such lists. In the event, I was asked (due to my incorrigible verbosity) to break the article into three pieces; only the first found its way to print. Here is the column in full.

Used to be, when a game was successful enough to demand a sequel, the design team would do its best to avoid repeating itself. Though I’m sure they mostly wanted to keep their jobs interesting, the practical effect was that if the games were different, they would both remain relevant. In an arcade, Donkey Kong Jr. could stand handsomely by its father, each shilling for its own share of the coin. You might call them companion pieces, rather than updates or replacements.

When home consoles hit, design teams were even more modest, and were generally left to do their own thing. So starting on the NES, you will see a certain trend: successful game spawns weird, only tenuously related sequel; fans of the original scratch their heads; a greatly expanded dev team releases a third game, which is basically just the first again, on steroids; fans think it’s the best thing ever, because it’s exactly the same, except better! And to hell with that weird second chapter.

Thing is… usually the second game is the most interesting you’ll ever see.

The New Generation – Part Two: Masterminds


by [name redacted]

Originally published by Next Generation.

Something is happening to game design. It’s been creeping up for a decade, yet only now is it striding into the mainstream, riding on the coattails of new infrastructure, emboldened by the rhetoric of the trendy. A new generation of design has begun to emerge – a generation raised on the language of videogames, eager to use that fluency to describe what previously could not be described.

First, though, it must build up its vocabulary. To build it, this generation looks to the past – to the fundamental ideas that make up the current architecture of videogames – and deconstructs it for its raw theoretical materials, such that it may be recontextualized: rebuilt better, stronger, more elegantly, more deliberately.

In the earlier part of this series, we discussed several games that exemplify this approach; we then tossed around a few more that give it a healthy nod. Some boil down and refocus a well-known design (Pac-Man CE, New Super Mario Bros.); some put a new perspective on genre (Ikaruga, Braid); some just want to break down game design itself (Rez, Dead Rising). In this chapter, we will highlight a few of the key voices guiding the change. Some are more persuasive than others. Some have been making their point for longer. All are on the cusp of redefining what a videogame can be.

The New Generation – Part One: Design


by [name redacted]

Originally published by Next Generation.

An idea is healthy only so long as people question it. All too often, what an idea seems to communicate – especially years and iterations down the line – was not its original intention. Context shifts; nuance is lost. To hear adherents espouse an idea, measureless years and Spackle later, is to understand less about the idea itself than about the people who profess it, and the cultural context in which they do so.

In 1985, an obscure Japanese illustrator slotted together a bunch of ideas that made sense to him that morning, and inadvertently steered the whole videogame industry out of the darkest pit in its history. Since that man’s ideas also seemed to solve everyone else’s problems, they became lasting, universal truths that it was eventually ridiculous – even heresy – to question.

So for twenty years, skilled artisans kept building on this foundation, not really curious what it meant; that it worked was enough. They were simply exercising their proven craft, in a successful industry. Result: even as technology allowed those designers to express more and more complex ideas, those ideas became no more eloquent. The resulting videogames became more and more entrenched in their gestures, and eventually spoke to few aside from the faithful – and not even them so well. Nobody new was playing, and the existing audience was finding better uses for its time. A term was coined: “gamer drift”.

Touch Generations


by [name redacted]

Originally published by Next Generation, under the title “FEATURE: A Short History of Touch”.

A few years ago, Nintendo launched the DS with a vaguely unsettling catch phrase: “Touching is Good”. Their PR team sent disembodied plastic hands to everyone on their mailing list, in the process creeping out Penny Arcade. As creepy and forward as the campaign was, it had a point. Touching has historically been good for the game industry.

On the whole, videogames are an awfully lonely set of affairs. They paint an alluring well, then give the player rocks to throw, to see what ripples. From Spacewar! to Pong, you’re always shooting or batting or throwing some kind of projectile, to prod the environment. Even in some of the most exploration-heavy games, like Metroid, the only way to progress is to shoot every surface in sight, with multiple weapons. Little wonder art games like Rez are based on the shooter template: it’s about as basic a videogame as you can get. See things, shoot things, you win. If things touch you, you lose. Except for food or possessions, generally you can only touch by proxy; toss coins into the well; ping things, to see how they respond. To see if they break.

Dead Rising: A Trope Down Memory Lane


by [name redacted]

In 1985, Shigeru Miyamoto came to town with a truckload of tropes, and they were so wonderful, they did such a great job at filling the creative vacuum of the time, that it took two decades for people to notice the limits to their application. Now, step by step, we’re kind of getting back our perspective. Under Satoru Iwata’s oversight, Nintendo – so long, so much to blame for the entrenchment – has painted a huge “EXIT?” sign in the air, with a wave and a sketch. Valve has suggested new ways to design and distribute software. Microsoft and Nintendo have tinkered with how videogames might fit into our busy, important lives. Blog culture is helping aging gamers to explore their need for games to enrich their lives, rather than just wile them away. And perhaps most importantly, the breach between the Japanese and Western schools of design is finally, rapidly closing.

( Continue reading at Game Career Guide )

Balloon Fight (***)


by [name redacted]

Time was, Nintendo was a company was a game. Then Mario was a commodity was a template was a cult.

The guy who dragged Japan’s oldest hanafuda manufacturer into videogame design was a quiet, oddball toy inventor named Gunpei Yokoi. Thanks to Yokoi, Nintendo had already been making “inventive and strange” toys and arcade amusements; in the late ’70s, videogames were just the next logical step. He rounded up a posse, agreed to babysit a slacker friend of his boss’s family, and built from the ground up Nintendo’s first design studio: R&D#1.

Before long, the kid — an art school graduate named Miyamoto — set the editorial tone of bold colors, bolder concepts, and boldest character design. Then he graduated again to set up his own internal studio, and over the next five years completed and refined the two or three ideas he would ever have as a game designer.

( Continue reading at ActionButton.net )

Donkey Kong 3 (*)


by [name redacted]

It’s been said that each of us only has one tune to play; all we ever do is change the way we play it. It’s also been said that Donkey Kong and Mario creator Shigeru Miyamoto’s tune originates in his personal hobbies, filtered through a love of Japanese and Western fairy tales. The Legend of Zelda has its roots in the fields and caves behind Miyamoto’s childhood home. Pikmin comes from Miyamoto’s garden. And Donkey Kong 3 is based on the premise that it is fun to spray DDT up a gorilla’s asshole. While being attacked by bees.

( Continue reading at ActionButton.net )

Aonuma’s Reflections On Zelda


by [name redacted]

Check out the comments section on the original article. Seriously.

On Thursday Aonuma candidly, and with self-effacing humor, spoke of his period of aimlessness and mistakes that began with the release of The Legend of Zelda: The Wind Waker, the way in which they reflected the Japanese industry as a whole, and how they led to Nintendo’s shift of focus over the last few years.

( Continue reading at GamaSutra )

Horii Himself, Out.


Yeah. This doesn’t completely surprise me, except in the sense that it actually happened.

Handhelds are a better place for introverted, focused experiences. (See Metroid II.) In terms of the mindset involved, playing a handheld is like reading a book, whereas playing a console is like watching TV. Again, look how perfect Dragon Warrior is on the Game Boy — how much better it is than on the NES. Also: having a lengthy “novel” game makes more sense if you can pick it up and put it down at leisure, rather than being forced to sit in one place and stare at a screen for hundreds of hours. Leave the consoles for flash and fun; visceral stuff. Like the Wii, say.

Also to consider: as great as DQ8 is, there are two major abstractions left that seem kind of contrary to what Horii wants to do with the series. For one, the player controls more than one character. That’s a little weird. For another, it’s got random turn-based battles. Honestly, that doesn’t seem like part of Horii’s great plan for the series. It never has; it’s just been something he’s settled with until now.

So yeah. The DS seems like an ideal place to put the game. What’s really interesting is the multiplayer aspect — which I didn’t expect at all, yet which again sort of makes sense, depending on how it’s implemented. If players can come and go at will — join each other or set off on their own tasks, each with his or her own agenda — it’ll work. If there are too many constraints to the framework, keeping people from just playing the damned game whether their friends are around or not, it’ll be a bit of a downer.

I’m kind of undecided what this game means in the end. On the one hand it seems likely it’s meant as an intermediary step while Horii works on Dragon Quest X for the Wii. Considering how far along this game seems to be (implying it’s been in the works for months at least, maybe a year), it seems like it’s part of a long-term plan. Also considering that the sword game seems basically like a testing bed for a new battle system… well, do the math. And yet, there’s this issue about the DS actually being the most suitable system out there right now (in terms of market saturation, the nature of the format, and the qualities it has to offer).

Maybe it’s just the most suitable platform for Dragon Quest IX in particular, for everything he wants to do with the game. If X is going to work the way I think it might, it’s going to be pretty visceral and showy — demanding a home system. One in particular (that being the most visceral available).

Basically, every game Horii makes appears to be just another approach to the same game he’s been trying to make for twenty years. He never quite winds up with what he wants — though lately he’s getting a little closer. From what I can see, this is just one more angle, allowing him to capture a certain aspect of his vision that he hadn’t been able to before (perhaps at the expense of some other elements, that he’s already explored). So, you know, right on. These details seem worth exploring.

The next game… maybe it’s time to assemble? See how all the pieces fit?

The thing that I dig about Dragon Quest is that, whatever the surface problems, the games are visionary. It’s a strong, uncluttered vision that all the games reflect even if they don’t always embody it. As “retro” as they seem, they’re not just crapped out according to a formula; they’re each trying to achieve something that’s way beyond them — meaning an endless pile of compromises.

I find that pretty encouraging. Not the placeholders; the way Horii isn’t afraid to use them, while he roughs out everything else. And that he doesn’t let them distract him; he just devises them, then discards them when they’re no longer of use. He keeps chugging along, going through draft after draft until he gets it exactly right. It’s a very classical disposition. Very honest, at least to my eye.

He’s a lot like Miyamoto, except Miyamoto sort of gave up a long time ago. And Miyamoto’s vision isn’t quite as focused (though in turn, it is broader than Horii’s).

The one problem I can see with going from turn-based to real-time battles is that the battles in Dragon Quest — I don’t think they’re really always meant to stand in for actual fighting, as much as they’re a stand-in for any number of hardships and growth experiences that a person like the player might encounter in a situation like the quest at hand. Some of that might be actual battle; some of it might be much subtler and harder to depict in a game like this.

Keeping the battles turn-based and separated from the wandering-around makes the metaphor a lot clearer as a compromise, rather than as something special or important in its own right. Changing to a system that makes the game actually about fighting loads of monsters… I’m not sure if this is precisely the point he’s looking for. Still, it’s a trade off. Get more specific somewhere, you have to lose a nuance somewhere else.

I wonder what other sorts of difficulties or experiences could be devised, besides semipermeable monster walls holding you back. Ones that would add to (or rather further clarify), rather than detract from (or muddy), the experience. And preferably that wouldn’t be too scripted.

I’m thinking a little of Lost in Blue, though I don’t know how appropriate its ideas would be, chopped out and inserted whole. Still, general survival issues seem relevant: having certain bodily needs (and maybe psychological ones — though who the hell knows how to address that) that, though not difficult to attend to, cause problems if you don’t. So in the occasions you do run into real immediate difficulty (battles, whatever), you’ll be in far greater danger if you’ve been pressing yourself too far; if you haven’t sufficiently prepared. Likewise, injury might be a real problem — so the player would have to think carefully, weigh cost and benefit, before charging into dangerous situations.

Not pressing out would mean you’d never learn more, get better, stretch your boundaries. Being foolhardy would get you killed. Same deal we’ve got now; just more nuanced.

I’m sure there are other ways to do it. Maybe more interesting ones.

It could be I’m reading in some things that aren’t overtly intended. Still, I’ve never felt the battles were as important as what they stood for. They’re too straightforward. They’re used too cannily, as a barrier. The trick, again, is whether there’s an interesting and functional way of more literally representing what they might stand for. I dunno. Maybe not! At least, not right now. So all right, violence. Fair enough.

Defining the Next Generation


by [name redacted]

This article was originally intended as a conclusion to NextGen’s 2006 TGS coverage. Then it got held back for two months as an event piece. By the time it saw publication its window had sort of expired, so a significantly edited version went up under the title “What The New Consoles Really Mean”.

So we’re practically there. TGS is well over, the pre-orders have begun; Microsoft’s system has already been out for a year (and is now graced with a few excellent or important games). The generation is right on the verge of turning, and all those expensive electronics you’ve been monitoring for the last few years, half dreading out of thriftiness and secret knowledge that there won’t be anything good on them for a year anyway, will become the new status quo. Immediately the needle will jump and point at a new horizon, set around 2011, and everyone will start twiddling his thumbs again. By the time the drama and dreams resume, I’ll be in my early thirties, another American president will have served nearly a full term – and for the first time in my life I really can’t predict what videogames will be like.

Five That Didn’t Fall


by [name redacted]

Part nine of my ongoing culture column for Next Generation. After the popularity of my earlier article, I pitched a companion piece about companies that had lived past their remit, yet technically were still with us. On publication we lost the framing conceit and the article was split into five pieces, each spun as a simple bottled history. In turn, some of those were picked up by BusinessWeek Online. Here’s the whole thing, in context.

A few weeks ago we published a list of five developers that made a difference, helped to shape the game industry, then, one way or another (usually at the hands of their parent companies), ceased to exist. One theme I touched on there, that I got called on by a few readers, is that although in practical terms all the listed companies were indeed defunct, several continued on in name (Atari, Sierra, and Origin), living a sort of strange afterlife as a brand detached from its body.

This was a deliberate choice; although Infogrames has been going around lately with a nametag saying “HELLO my name is Atari” – and hey, why not; it’s a good name – that doesn’t make Infogrames the historical Atari any more than the creep in the purple spandex with the bowling ball is the historical Jesus. (Not that I’m relating Infogrames to a fictional sex offender – though he is a pretty cool character.) The question arises, though – what about those companies which live on in both name and body, yet which we don’t really recognize anymore? You know who I’m talking about; the cool rebels you used to know in high school, who you see ten years later working a desk job, or in charge of a bank. You try to joke with them, and they don’t get a word you’re saying. You leave, feeling a mix of fear and relief that (as far as you know) you managed to come out of society with your personality intact.

The same thing happens in the videogame world – hey, videogames are people; all our sins are handed down. This article is a document of five great companies – that started off so well, ready to change the world – that… somehow we’ve lost, even as they trundle on through the successful afterlife of our corporate culture. And somehow that just makes us miss them all the more.

The Nose Before Your Face


by [name redacted]

Part eight of my ongoing culture column; originally published by Next Generation, under the title “The Value of Simplicity”.

So lately we’ve been swinging back toward thinking about games as a medium of expression. It’s not a new concept; way back in the early ’80s, companies like Activision and EA put all their energy behind publicizing game designers like rock stars – or better yet, like book authors – and their games as unique works by your favorite authors. This all happened just after figures like Ed Logg and Toshihiro Nishikado started to extrapolate Pong and Spacewar!, incorporating more overt narrative frameworks and exploring more elaborate ways of interacting with the gameworld. From this initial explosion of creativity came Steve Wozniak and the Apple II, providing an easy platform for all of the early Richard Garriotts and Roberta Williamses and Dan Buntens to come.

Then stuff happened, particularly though not specifically the crash; the industry changed in focus. On the one hand we had ultra-secretive Japanese companies that – like Atari before them – usually didn’t credit their staff for fear of sniping and for the benefit of greater brand identity; on the other, what US companies remained tended to inflate beyond the point where small, expressive, intimate games were economically feasible. And then there’s just the issue that, as technology grew more complex, design teams grew larger and larger, making it harder for any one voice to stand out, leading to more of a committee-driven approach.

And Then There Were None


by [name redacted]

Part three of my ongoing culture column; originally published by Next Generation, under the title “Culture: Five that Fell”.

For all its immaturity, you can tell the videogame industry is getting on in years. With increasing, even alarming, frequency, the faces of our youth have begun to disappear – forced from the market, absorbed into conglomerates, restructured into oblivion, or simply retired from the grind.

The first big wave hit back in the mid ’90s, when increased development costs, the demise of the American arcade, and the shift from 2D to 3D development left dozens of small and mid-sized developers – from Toaplan to Technos – out in the cold. Those that didn’t die completely – Sunsoft, Vic Tokai – often pulled out of the US market, or even out of the videogame business. Western outfits braced for the storm by merging with larger and ever larger publishing conglomerates, rationalizing that it was the only way to survive in an uncertain market.

The second wave came only a few years ago, after the burst of the tech bubble. In an effort to streamline costs, parent companies began to dump their holdings left and right, regardless of the legacy or talent involved. Those that didn’t often went bankrupt, pulling all of their precious acquisitions down with them. Sometimes the talent moved on and regrouped under a new name; still, when an era’s over, it’s over.