
Tuesday, February 15, 2011

The Next Console Generation? Don't Hold Your Breath

Throughout the history of video games and video game consoles, one rule has held steadfast--every five or so years, the industry refreshes with what's referred to as a new "generation" of consoles, where each of the major console manufacturers trumpets out new, more powerful devices while slowly phasing out the old ones. 2011 marks the fifth year since the 2006 launch of the Nintendo Wii and PlayStation 3--and the sixth year since the Xbox 360's launch in 2005--yet not only have we heard nothing about any new machines, it does not appear likely that new systems will enter the market until 2015. The reality is that nobody involved in the industry--from the manufacturers to the developers to the publishers to even the consumers--even wants a new console generation anytime soon. And for good reason.

PS2: The First 10-year Console (and counting)
When Sony first launched the PlayStation 3, the Japanese tech giant defended the system's initially astronomical price tag by claiming that the PS3 was designed for a ten-year life cycle. While there were plenty of critics doubting Sony's lofty ambitions, in hindsight the goal was not so unrealistic given the ongoing success of the PlayStation 2. By the time the PS2 reached its 10th anniversary in 2010, sales had declined significantly from its heyday, but it was far from dead in the water (games continue to be published for the system to this day). In other words, a system planned for a 10-year life cycle could surely achieve that goal with little trouble given the PS2's already-established staying power (then again, the PS2 is the most successful console ever made).

However, the PS2 achieved this feat 4-5 years after the start of the current console generation, so it can't really be regarded as the catalyst for the elongation of the life cycle. Instead, we must answer a fundamental question: why would Sony, Microsoft, or (to a lesser extent) Nintendo even want to release a new console right now? The PS3 and Xbox 360 are still growing and have yet to hit critical mass. Further, the cost of actually releasing a new console is tremendous; the manufacturers often launch new consoles at a loss and make up the difference over time through game licensing fees, accessory sales, and (down the road) cheaper and/or more efficient components. There's no reason for these companies to go back into the red when the current generation is still in its prime and still growing. The economic turmoil that has tightened budgets around the world--and that played a role in slowing growth in the generation's early years--has only reaffirmed this position.

The PS3 Slim makes the unit cheaper for Sony to manufacture.
(Note: the Nintendo Wii may be an exception to this logic and will be excluded for the rest of the article. The Wii experienced its first sales decline in 2010 and--while still strong and still the overall market leader--there is no doubt that the system will reach its demise first of the three, especially considering that it was a far weaker console to begin with. Even so, don't expect a successor until at least next year; the Wii still retails for $200, meaning there's plenty of room for sales-invigorating price drops.)

Further strengthening the console manufacturers' position, the developers and publishers are more than happy to keep the current consoles around for as long as possible. The installed bases (the number of people who own the current consoles) are still growing. A new set of consoles would mean higher development costs and a need for developers to learn entirely new computer architectures just as they are finally tapping the full potential of the PS3 and 360. Publishers would also not want to combine these higher costs with installed bases that would inherently be only a fraction of what's available right now, which would severely limit the sales potential of any next-generation game, no matter how good. The growing installed bases of the current consoles also decrease the risk of developing mid-range niche titles that can succeed by selling to just a small segment of the market rather than needing to achieve AAA blockbuster success.

Because of Xbox Live's heavy integration into the 360 dashboard, Microsoft was able to completely overhaul the dash to accommodate changing times.

No rational consumer wants a new console either. With developers finally getting a firm grasp on development for this generation, gamers are able to play increasingly higher-quality games. The aforementioned benefits of growing installed bases also mean that developers and publishers will be willing to take more risks, which opens the door to new, more radical gameplay ideas. Consumers also get the advantage of sequels and entire trilogies that can actually build off previous titles in the same generation; it would be almost impossible for BioWare's Mass Effect trilogy to maintain its sense of continuity from game to game if the generation were cut short, as these games currently import previous games' save data directly off the hard drive (of course, a cloud-based memory solution like what Sony's launching with PlayStation Plus could alleviate this hurdle). Perhaps most importantly, these consoles still cost about $300, or the traditional launch price of a new console (though that tradition has already been thrown out the window). Should sales start declining, as the Wii's are beginning to, there's plenty of room for price drops, which trigger massive increases in sales. With the abundance of quality games still being released for the current consoles, the last thing a consumer wants to do is shell out hundreds of dollars to start over again.

Kinect and the Xbox 360 S: a recipe for longevity.
There are several other factors in this generation's elongation. For one, the primary driving force behind the launch of previous generations was technological advancement that allowed for a significant increase in pure graphical power. Let's face it: no matter how much more realistic future consoles may get, any improvements would only be incremental next to the strikingly realistic visuals already achieved by today's consoles. Another game-changer is the current consoles' integration with online connectivity. The PS3 and Xbox 360 as they exist today are far more advanced than the machines that launched, even setting the slimmer redesigns aside. Having the Internet so thoroughly integrated into the experience means that the consoles are able to evolve as time goes on--something that previously necessitated an entirely new hardware product. Already we've seen Netflix, Hulu, ESPN, MLB.tv, NHL GameCenter, Last.fm, Facebook, Twitter, Zune, IPTV, and a whole bunch of other applications get added to the consoles' stable of features.

Finally, 2010 saw the launch of major peripherals for both the PS3 and 360 that provide a new method of interaction that could easily be associated with a new console, but instead are simply additions to the current hardware. Sony and Microsoft certainly hope that the PlayStation Move and Kinect Sensor for Xbox 360 can carry their respective hardware for the equivalent of a new generation, but as we've seen this whole time, even if they flop (which they won't), these consoles are plenty capable of continuing to grow on their own.

Tuesday, February 8, 2011

The Marketing Departments Have It All Wrong

Successful marketing is an incredibly useful tool for getting consumers to buy your product, and the world of entertainment media is no exception to that rule. There are countless examples across all forms of media of just how important marketing (or the lack thereof) can be to a product's ultimate financial success. While video games have proven to be an incredible gold mine, often generating more income than Hollywood movies, the industry as it exists now is far from sustainable. Publishers need to pour tons of money into development for today's consoles, meaning there is extremely high risk if a product fails--which is why we see fewer original IPs and a market dominated by big-budget sequels. And therein lies the biggest problem of the games industry: the publishers don't understand how to market their product. "But I see Call of Duty and Halo ads all the time!" one might retort. It's true; publishers do a pretty good job of pushing their biggest brands and creating financial juggernauts. Unfortunately, that means a select few games are basically driving the entire industry--and it doesn't take an insightful analyst to realize that's not exactly a formula for long-term survival, let alone growth. The problem is exacerbated when companies insist on pumping out new titles for these brands year in and year out, which not only burns out the developers and their creativity, but dramatically accelerates what is known as "franchise fatigue" among consumers. Just look at what happened to Guitar Hero and Rock Band: once the financial gems of the industry for their ability to move not only millions of units of games, but also high-return plastic instruments, they are now toxic assets that became dead in the water just as fast as they rose to glory.
(In case you're wondering, Activision published a whopping 25 Guitar Hero SKUs in 2009 alone--a year that also saw revenue in the overall genre drop 50% to $700 million from $1.4 billion the year before).
Once a breakout hit, Guitar Hero is now only a shadow of its former glory.

Now, one could easily argue that the music game genre's catastrophic collapse had more to do with Activision's greedy higher-ups than with a marketing department failing to handle so many similar products. But this argument only circles back to the underlying trend that is the fault of the marketing departments: they are not marketing the right product. We've already gone over how publishers focus on select franchises and try to squeeze as much money as they can out of those names. The fact of the matter is, they are promoting the wrong names, period. Rather than promoting the game brand itself, publishers should be promoting the "brand" that is the talent behind these games. A bit altruistic, you might say, but let's take a closer look by peeking over at the much-compared medium of movies. Certainly there are cases where movie studios are able to push a brand to sell a movie (Star Wars, James Bond, and pretty much any movie based on a comic book come to mind). But if you think about the way most movies are promoted and hyped up, it's usually one of two things (or a combination): the director and the actors. Studios often bank on big-name talent to bring consumers to their product. People already know they are probably in for a good movie if Steven Spielberg is the director or if Russell Crowe is the lead actor. These kinds of "stars" do not exist in the games industry--which is entirely the fault of the people who promote the games. Of course, game development is inherently more team-oriented and less based on individual talent, but that is exactly why the industry should start promoting its studios as its stars.

If you don't know Bungie, you certainly know its most successful creation.
To some extent, this has already happened. Sort of. People who follow the industry closely know the name "Bungie" carries a whole lot more weight than "Saber Interactive" because they already know what Bungie has accomplished and what it brings to the table. Bungie hasn't announced so much as a title or even a genre for its next game, but gamers are already salivating for it. The real money, however, is not in the people who already know what games they are going to buy, but rather in the mainstream market that pretty much bought an Xbox for Call of Duty, Halo, and Madden... because that's all they know. Your average Joe probably doesn't know the difference between an Infinity Ward Call of Duty game and a Treyarch Call of Duty game. The existing practice obviously devalues the developer, but that's not as important to the publisher's bottom line as the eventual sales, so they promote the game brand--which they have more control over--instead. However, incentives exist for publishers as well. If publishers start promoting their studios rather than just the game brands, they'll be able to create completely new IPs (and, to them, potential new series/franchises) on the backs of those developer names alone (i.e., without the high financial risk usually associated with a new IP). They don't have to waste money explaining why Game X is the next big thing while not-so-subtly hinting that its sequel is already in the works and will be even bigger. Instead, they can simply say, "This is the newest game from Bungie." Boom, millions of copies sold, done deal. (The studio has to actually make a good game, of course, but a studio that is able to consistently feed its creativity is far more likely to be working as hard as it can.)

This is a bold proclamation that might seem like too big a step for the industry's tepid publishers. Luckily, there is already an example of this practice achieving mainstream success. For years, Rockstar was known for one game franchise and one game franchise only: Grand Theft Auto. Anybody reading this knows those games don't need any introduction. It's a game brand that practically prints money. Or is it? Last year, Rockstar unleashed its latest project, and it wasn't a game about urban crime. Rather, it was a brand-new game set in the Old West. It was called Red Dead Redemption, and it went on to sell millions upon millions of copies and is heralded as one of the best games of last year. While the game is technically a sequel, it shares no ties to its under-the-radar predecessor, which wasn't even originally developed by Rockstar (the former Capcom game was dumped and Rockstar scooped up the rights, pushed it out, then geared up for the project it really wanted to do). How could what is essentially a new IP make such a splash right out of the gate? Because Rockstar was able to promote the Rockstar brand instead of the game's brand. Very clearly above the title, and before every trailer and commercial, reads a line that says, "Rockstar Games Presents." As in, "Rockstar Games Presents Red Dead Redemption." At the very least, pushing their name on the front of the box puts it out there for the future, so even if you didn't know they made GTA, you'll know they made RDR when they ship their next game (which, by the way, is L.A. Noire, and it will also carry the "Rockstar Games Presents" tag on its box).

By doing this, Rockstar has taken steps to promote the Rockstar brand. Suddenly, the company has two mega-blockbusters, yet its next two games (L.A. Noire and Max Payne 3) are not from either of those franchises. Rockstar has been able to successfully promote its name, which allows it to explore game ideas in all sorts of stories and settings rather than endlessly iterating on GTA until consumers finally get sick of it. Not only that, but when Grand Theft Auto V eventually does drop, it will come to additional applause and fanfare for returning from a long absence, which generates all sorts of hype on its own. And in the meantime, Rockstar is still making a pretty penny on other, brand-new titles, because people know that it's not just that GTA is a good game--it's that Rockstar is good at making good games. Being able to say "from the studio behind Grand Theft Auto" is an incredibly powerful marketing tool, and one that is scarcely used in the games industry--somewhat baffling considering how often you hear similar phrases in movie trailers. Promoting the talent will allow developers to create all sorts of new games, which will stave off franchise fatigue, foster creativity (which is good for the entire industry), and reduce the dependence on big-budget sequels. This means more creative freedom for developers, a wider portfolio for publishers, and a whole wealth of different experiences for consumers. It's a win, win, win. We "hardcore gamers" already know the great studios. It's time the rest of the world found out as well.

Tuesday, February 1, 2011

Copying vs. Learning

Learn from the best. It's a simple concept, and one that pervades all varieties of industries in any capitalistic society. Apple's iPhone was the first smartphone for the masses, not just the businessmen, and three years later we have all varieties of smartphones, many taking important cues from Apple (like multi-touch displays) or trying to improve on its design. Kobe Bryant frequently states how he studies and learns from the NBA's most legendary players to improve his own game. In debating new bills, lawmakers frequently reference how the core elements have already played out in certain states or foreign nations. Citizen Kane is often credited as the greatest film of all time, not because it was so jaw-droppingly entertaining, but because of the myriad technical and cinematic tricks it introduced that are now standard in any Hollywood production. It is an important part of any competitive practice to study the best that's been done, emulate it as best you can, and then build on those key traits in an original way. When it comes to game design, however, many developers shy away from borrowing ideas laid out in other games, and those that do are often derided as copycats.
Darksiders' protagonist, War (center).

In January 2010, an upstart studio called Vigil Games partnered with publisher THQ to release an original game called Darksiders. The game was an original take on the Four Horsemen of the Apocalypse, with a comic book-inspired art design and a semi-open world with linear objectives. When Darksiders hit shelves, it arrived to mixed reviews. The game was heralded as fun and well-designed, but took several knocks for being a copycat of God of War or The Legend of Zelda. These criticisms weren't directed at specific shortcomings, but merely at the fact that Vigil Games didn't reinvent the wheel with entirely new mechanics from the top down. Never mind that plenty of games have borne some resemblance to the critically acclaimed Zelda series before this. Never mind that the elements Vigil borrowed were just that--elements, not entire gameplay ideas or concepts. From the game's announcement, Vigil had stated that it was attempting to make a game in the Zelda mold, with an emphasis on combat that would bring it close to God of War, yet featuring an entirely new, well-thought-out, and imaginative universe. But to some reviewers, that meant little; Darksiders was just a copycat of Zelda or God of War that was simply not as good.

Trucks and Skulls: look familiar?
My question for these "critics," to which I cannot fathom an answer, is: why? Surely there are more egregious copycats out there deliberately trying to steal someone else's idea and make a quick buck off it. Just head to the App Store and look at Trucks and Skulls, a total facsimile of the breakout hit Angry Birds, albeit with a thin layer of fresh paint. But Darksiders was no Trucks and Skulls. It borrowed gameplay elements, sure, but not only did it borrow from multiple, entirely different games, it meshed them in new ways in an entirely new and compelling presentation. Vigil Games was pretty upfront that Zelda and God of War were serving as inspirations for its game--and there is nothing wrong with that. At all. Especially when you consider that these are two of the most heralded franchises in all of gaming. Vigil--a new studio headed by a guy named Joe Madureira, who was completely new to the medium--was simply learning tricks from the best in the business and applying them to an entirely original concept. To criticize Vigil for this is to completely miss the point.

That is not to say there aren't "real" copycats out there. Dante's Inferno and the recent Medal of Honor reboot bear some pretty uncanny resemblances to God of War and Call of Duty, respectively, and it's pretty likely that these games were cases of trying to get in on the cash cow. But even in these examples, it's unfair to write these games off as simple copycats. While the gameplay in Dante's Inferno feels like it was literally pulled straight out of the God of War games, it does bring a unique element to the table in that its story and level design are actually interpretations of a famous literary work (Dante Alighieri's The Divine Comedy). So, in that sense, Dante's Inferno is only a half copycat. But in this case, given the incredible similarities in the gameplay alone (really, play both and you'll see it's almost exactly the same combat mechanics, buttons, moves, etc.), it would be fair to criticize DI's gameplay as a less glorious rip-off of GoW's. In the case of Medal of Honor, the similarities to Call of Duty's Modern Warfare sub-series are even more egregious--and fair game for criticism--but the development team at Danger Close did try its own spin on the formula by using a real conflict in Afghanistan as a setting rather than an entirely fictional one. So, while examples of "copying" do exist, Darksiders is nowhere near what these two games did and should not be criticized in the same way.
Heavy Rain's innovative control scheme allows for more cinematic experiences--and deserves to be adopted by other developers.

In fact, it's almost as if game developers don't do enough copying. Sure, there are the examples listed above, and there have been other attempts at straight-up copying an idea. But few developers try to do what Vigil Games actually did, which is not to set out to copy or one-up, but rather to study, learn, and apply in a new and interesting way. There are some pretty well-established gameplay models that would absolutely flourish in other settings, other stories, and so on. We saw a glimpse of this potential last year when Rockstar took its own award-winning Grand Theft Auto mechanics and applied them to a Wild West setting in Red Dead Redemption. It obviously tweaked various aspects to fit the setting, timeline, and story, but the hallmarks of GTA were clearly evident. In many ways this is similar to what Vigil was doing with Darksiders and, frankly, is a practice that more developers should be looking into. After all, how many gamers heralded the innovative gameplay of last year's Heavy Rain but hated its core story? That game's developer, Quantic Dream, has already stated it is moving on to its next innovative idea and technology, so in its place, who wouldn't want to see a new game take Heavy Rain's mechanics and apply them in wholly original ways? Or, why not take elements from those mechanics and use them to enhance certain parts of other games? The industry holds a wealth of great gameplay mechanics, great ideas, great interactive elements and ways of engaging the player... developers just need to do a better job of learning from each other. And the so-called "critics" need to back off on their jaded criticism and allow new minds to tackle old ideas.

Sunday, January 23, 2011

The Importance of Semantics

"Semantics" is commonly derided in our daily jargon as nit-picky and annoying, but today I will make the case for its importance... especially in the case of the adolescent games industry. Semantics--or, more simply, word choice--can affect the reader in very subtle ways without them even noticing (though, as a Communications major, these are effects I pick out quite regularly). Before we dive in, we should go over why word choice is so important at all. When discussing "video games," word choice is everything. The medium itself is very young by the standards of media overall (just look how long books, movies, TV, etc. have been around). More significantly, its roots are not so much those of a new communication medium but rather of a new genre of toys. The Atari 2600 and the Nintendo Entertainment System were not heralded as the coming of a new way to tell and experience stories, but rather as a new way to kill time and have fun--especially for kids. This is why many assume (sometimes incorrectly) that the adolescent male is by far the dominant demographic of game consumer; though historically this may have been true, it's not today, and the change is happening faster than most people realize. In fact, many gamers don't even want video games to be looked at as a storytelling tool; they are quite happy with their delightful hobby and just want to play games as games--for fun.

Zaeed doesn't like it when Cmdr. Shepard forgets to italicize Mass Effect.

As I discussed last week, the growing split in game design (and consumption) does not mean that either side is "right" and the other needs to go away. They can both very easily co-exist, but as a whole, it's time for the general public to start paying more respect to games as a true storytelling medium. But for everyone else to respect games, gamers themselves must respect games. And that's where semantics come in. First and foremost, when writing about games, the title must be placed in italics. You'll notice that The Paradigm-Shifting Blog adopted this practice long ago, and for good reason: in the English language, you are supposed to italicize the titles of full bodies of work. Like a book (duh). Also a movie. A newspaper. A music album. A television series. A video game. As you likely noticed, a "full body of work" does not necessarily mean "a piece of media with a long story." Game titles need to be italicized. Period. If we gamers cannot respect our craft with a concept as simple as recognizing games as someone's (or some group's) complete body of work, then why should anyone else? No more Mass Effect, Uncharted, or even Super Meat Boy. It's Mass Effect, Uncharted, and Super Meat Boy.


There was a time when the only objective was to "go right
and you win," but not anymore.
Okay, so maybe italicization is not so much "semantics" as a lesson in proper grammar, but the idea is the same. It's part of an overall effort to separate video games from toys. Even if the medium originated that way, it has grown into a full-fledged entertainment outlet. In the old days, the entire point of playing a game was to overcome an artificial challenge. There was no story progression or character development. That was it. "Beat" the game. Except we still use that word. My roommate ran down to my room last week to tell me that he "beat" Assassin's Creed: Brotherhood. What he really meant to say was that he finished it. What did he beat? What challenging algorithm did he have to outsmart? What patterns did he need to recognize and execute to perfection? Of course, some gamers still play "story" games--including Assassin's Creed--for that same sense of overcoming an explicitly designed challenge. This is why most games still provide some form of "hard mode," where the point is for you to be better than the computer. You bested it. So in that sense, yes, you can still "beat" a hard mode, or, if you are playing a game solely for the challenge, you can "beat" it by doing everything you've been tasked to do. But if you are playing the game not only for the fun factor but also to experience an incredible story (which is the reason most people have for playing games like Assassin's Creed), then you're not "beating" anything; you're finishing the story. This idea is very small and may seem unimportant, but it is part of the overall project of separating video games from toys and grouping them in their proper place as an informational medium.

Another example of poor word choice is the prevalent use of the term "franchise." Now, publishers often use the term "franchise" quite accurately with regard to their hopes and wishes for a game brand. Halo, for example, is very much a franchise, having games developed by different developers, a toy line, an anime series, a long-gestating film project... you get the picture. "Franchise" refers to the big-picture money-maker of the overall brand. But let's take a different angle on this. Obviously, Star Wars can easily be considered its own franchise. But within that gargantuan franchise lies the core of it all: the "series" of Star Wars movies. They exist in the same medium and tell (for all intents and purposes) the same story. They are direct sequels/prequels of each other. They have self-contained story arcs and ones that play out over whole trilogies (or two!). I bring this up because many game developers, writers, and analysts often refer to their series as a franchise when the two terms are not interchangeable. As with Halo, "franchise" definitely applies in some cases--and perhaps many of these developers truly wish for their baby to become a true franchise. But when referring to a specific series of games, it really devalues them from an artistic perspective to label them a "franchise"; doing so is like saying the game is being made for the sole purpose of making money. Like a toy. And, frankly, like how some games in this business are made (Call of Duty says hi). Obviously, all developers hope to make money off their work. But referring to the Uncharted series as a "franchise" severely devalues the work of developer Naughty Dog in pushing the limits of game design, narratively and otherwise.
Nathan Drake's adventures are about a lot more than just filling the bank.

So, I task you with being aware of these important distinctions when you are reading, writing, or talking about games. These are far from the only examples, and if I truly wanted to write about every last one of them, I could probably fill a book. But you don't need to read a book to get the central idea behind this article. Games are not toys anymore. In many ways, video games are a lot more than the traditional definition of a "game"--perhaps the most striking semantic violation of all. I've long considered "video game" a somewhat antiquated term for this interactive medium I've come to love and respect so much. Right now "video game" is really the only term we've got... but it's about as relevant as describing a movie/film as a "motion picture" (after all, movies are a lot more than simply moving images nowadays). With that thought, I'll leave you, the reader, to engage in a collective brainstorm over what we should really call the interactive medium long referred to as "video games." What's the best you've got? Post your ideas in the comments.

Monday, January 17, 2011

The Growing Dichotomy of Game Design

There is something happening in the video game industry. A fissure has been growing steadily wider over the past decade or so, and it will only continue to widen in the future. The change it brings has ramifications for fundamental design, for the way games will be sold and marketed, and even for the way they are played. It comes down to the central philosophy driving a game's creation. In layman's terms, the dichotomy boils down to one question: is the game focused on story, or on play mechanics?

The heights of realistic graphical and audio quality that today's consoles are capable of have enabled story and narrative to take a more central role in many of today's games. However, certain game genres--for example, puzzle, fighting, and racing games--are built entirely around the "fun" of their mechanics, and any story is usually superfluous and unimportant (or even nonexistent). On the other side of the coin, you have games like Mass Effect, Heavy Rain, and Uncharted that are starting to rival Hollywood movies in story quality, production value, and creative talent. While these games take steps of varying lengths toward being "more movie-like," they all retain the common denominator that the medium's intrinsic interactive qualities are integral to the way the story is experienced. That is to say, the fact that you, the player, are directly impacting the story is a key differentiator between movies and games and is still a defining aspect of these "story-focused" games; the difference is, it's not the only defining aspect.

Heavy Rain's revolutionary control scheme brings a new dimension to interacting with the game's story.

In fact, play mechanics are just as important to "story" games as they are to "mechanics" games. As Epic Games designer Cliff Bleszinski just so happened to say today on Twitter,
"If you have a great game with a bad story you still have a great game. If you have a bad game with a great story you still have a bad game. [But] if you have a great game with a great story then you have a classic."
To understand this, you must recognize that the emergence of story as a central element of many modern games is a relatively new phenomenon; in the old days, story was very much an afterthought, and the mechanics always came first. Furthermore, what makes a game a game is that interactivity is central to the experience... and interactivity is achieved via play mechanics. Even so, the few developers that have dipped their toes in the vast ocean of great narratives have opened whole new doors of possibility.

The added dimension of interactivity means games are capable of telling stories that no other medium can, and by combining great game design on a mechanics level with the high-quality narrative, characterization, and production values of movies, the door is open for fantastic new experiences to emerge. Games like BioShock have already played with the notion of interactivity as a central element of the story. Games like Mass Effect give your individual decisions a ripple effect whose ramifications spread through the entire game's world--or multiple games' worlds--affecting your path through the story and the way other characters react to you. Games like Uncharted are more exciting than any summer blockbuster at the movie theater because you are participating in the action, while witty writing and strong voice acting bring the story to a more personal level.

The way you've acted as Commander Shepard decides whether or not Wrex lives in Mass Effect.

But not all games strive for these goals--and not all need to. There is room in the market for games that merely want to do what games were originally designed to do, without all this fancy-schmancy new-age story business. Fighting games in particular excel at using their idiosyncratic combat mechanics to drive competitiveness in players. Racing games center themselves on two simple things: driving cars really fast and being the first one to the finish line. These games don't need canonical explanations; they simply need to be fun. And that's just fine. Even some games that do have stories, like the Call of Duty series, are not necessarily trying to push the envelope of interactive storytelling, and are arguably designed with a gameplay-is-the-only-important-thing philosophy--not that you can blame them, given that multiplayer (arguably a separate "mechanics" game) is so important to their sales.

SoulCalibur IV doesn't need a story to be really fun.

This is the divergence we are seeing in game design today. Some games will simply push the fun factor of what games can do; others will push the artistic envelope of how interactivity can be used to tell an engaging story. Likewise, when gamers choose to play a certain game, it's often because a) they love its mechanics or b) they love its story. It's not a matter of one design philosophy "winning" over the other; it's about the way the two coexist and feed off each other. Mass Effect 2 is a great example, borrowing the cover-based gunplay of Gears of War to make playing through its story more fun while still retaining its focus on story and character development. The emergence of truly great game narrative, and the push to take artistic advantage of games' interactivity, does not mean that "fun" games will go away. What it does mean is that games don't necessarily have to be fun, whimsical distractions; they can instead provide an artistic canvas for exploring higher-level questions, such as the illusion of free will (BioShock) or the ethics of cloning (Mass Effect).

Mass Effect 2 combines its great narrative with Gears of War's fun cover system.

What do you think of this growing dichotomy of game design? Share your thoughts in the comments.