
Monday, March 14, 2011

How Interactivity Opens New Storytelling Possibilities

I made this video for a class project in which I had to analyze an aspect of new media. Obviously, I chose games, and--equally unsurprisingly, if you know my true passion in the medium--I took a look at what the element of interaction means from a narrative perspective. While my original vision was grander in scope and content, time limitations forced me to cut a lot out of the video and summarize even more. Here's the description from my YouTube page and the embedded video itself.

"Advances in technology have been changing the we create and consume stories for over a hundred years. More recently, video games have achieved a level of detail that allows for entirely new ways to tell and experience stories. The medium's inherent interactive qualities have opened up new doors, and some skilled developers are already laying the foundation."

Tuesday, February 15, 2011

The Next Console Generation? Don't Hold Your Breath

Throughout the history of video games and video game consoles, one rule has held steadfast: every five or so years, the industry refreshes with what's referred to as a new "generation" of consoles, where each of the major console manufacturers trumpets out new, more powerful devices while slowly phasing out the old ones. 2011 marks the fifth year since the 2006 launch of the Nintendo Wii and PlayStation 3--and the sixth year since the Xbox 360's launch in 2005--yet not only have we yet to hear of any new machines, it does not appear likely that new systems will enter the market until 2015. The reality is that nobody involved in the industry--from the manufacturers to the developers to the publishers to even the consumers--wants a new console generation anytime soon. And for good reason.

PS2: The First 10-year Console (and counting)
When Sony first launched the PlayStation 3, the Japanese tech giant defended the system's initially astronomical price tag by claiming that the PS3 was designed for a ten-year life cycle. While there were plenty of critics doubting Sony's lofty ambitions, in hindsight the goal was not so unrealistic given the ongoing success of the PlayStation 2. By the time the PS2 reached its 10th anniversary in 2010, sales had declined significantly from its heyday, but it was far from dead in the water (games continue to be published for the system to this day). In other words, a system planned from the start for a ten-year life cycle could surely achieve that goal, given the lasting power the PS2 had already demonstrated (then again, the PS2 is the most successful console ever made).

However, the PS2 achieved this feat four to five years after the start of the current console generation, so it can hardly be regarded as the catalyst for the elongation of the life cycle. Instead, we must answer a fundamental question: why would Sony, Microsoft, or (to a lesser extent) Nintendo even want to release a new console right now? The PS3 and Xbox 360 are still growing and have yet to hit critical mass. Further, the cost of actually releasing a new console is tremendous; the manufacturers often launch new consoles at a loss and make up the difference over time through game licensing fees, accessory sales, and (down the road) cheaper and/or more efficient components. There's no reason for these companies to go back into the red when the current generation is still in its prime and still growing. The economic turmoil that has tightened budgets around the world--and that played a role in slowing growth in the generation's early years--has only reaffirmed this position.

The PS3 Slim makes the unit cheaper for Sony to manufacture.
(Note: the Nintendo Wii may be an exception to this logic and will be excluded for the rest of the article. The Wii experienced its first sales decline in 2010, and--while still strong and still the overall market leader--there is no doubt that the system will reach its demise first of the three, especially considering that it was a far weaker console to begin with. Still, don't expect a successor until at least next year; the Wii still retails for $200, meaning there's plenty of room for sales-invigorating price drops.)

To further strengthen the console manufacturers' position, the developers and publishers are more than happy to keep the current consoles around for as long as possible. The installed bases (the number of people who own the current consoles) are still growing. A new set of consoles would mean higher development costs and a need for developers to learn entirely new computer architectures just as they are finally tapping the full potential of the PS3 and 360. Publishers would also not want to combine those higher costs with installed bases that would inherently be only a fraction of what's available right now, severely limiting the sales potential of any next-generation game, no matter how good. The growing installed bases of the current consoles also decrease the risk of developing mid-range niche titles that can succeed by selling to just a small segment of the market rather than needing to achieve AAA blockbuster success.

Because of Xbox Live's heavy integration into the 360 dashboard, Microsoft was able to completely overhaul the dash to accommodate changing times.

No rational consumer wants a new console either. With developers finally getting a firm grasp on development for this generation, gamers are able to play increasingly high-quality games. The aforementioned benefits of growing installed bases also mean that developers and publishers will be willing to take more risks, which opens the door to new, more radical gameplay ideas. Consumers also get the advantage of sequels and entire trilogies that can actually build off previous titles in the same generation; it would be almost impossible for BioWare's Mass Effect trilogy to maintain its sense of continuity from game to game if the generation were cut short, as these games currently import previous games' save data directly off the hard drive (of course, a cloud-based memory solution like what Sony's launching with PlayStation Plus could alleviate this hurdle). Perhaps most importantly, these consoles still cost about $300, the traditional launch price of a new console (though that convention has already been thrown out the window). Should sales start declining, as the Wii's are beginning to, there's plenty of room for price drops, which trigger massive increases in sales. With the abundance of quality games still being released for the current consoles, the last thing a consumer wants to do is shell out hundreds of dollars to start over again.

Kinect and the Xbox 360 S: a recipe for longevity.
There are several other factors in this generation's elongation. For one, the primary driving force behind previous generation launches was technological advancement that allowed for a significant increase in pure graphical power. Let's face it: no matter how much more realistic future consoles may get, any improvements would only be incremental next to the strikingly realistic visuals already achieved by today's consoles. Another game-changer is the current consoles' integration with online connectivity. The PS3 and Xbox 360 as they exist today are far more advanced than the machines that launched, even setting the slimmer redesigns aside. Having the Internet so thoroughly integrated into the experience means that the consoles are able to evolve as time goes on--something that previously necessitated an entirely new hardware product. Already we've seen Netflix, Hulu, ESPN, MLB.tv, NHL GameCenter, Last.fm, Facebook, Twitter, Zune, IPTV, and a whole bunch of other applications get added to the consoles' stable of features.

Finally, 2010 saw the launch of major peripherals for both the PS3 and 360 that provide a new method of interaction--one that could easily have been saved for a new console but instead arrived as an addition to the current hardware. Sony and Microsoft certainly hope that the PlayStation Move and Kinect for Xbox 360 can carry their respective hardware for the equivalent of a new generation, but as we've established this whole time, even if the peripherals were to flop (they won't), these consoles are plenty capable of continuing to grow on their own.

Tuesday, February 8, 2011

The Marketing Departments Have It All Wrong

Successful marketing is an incredibly useful tool for getting consumers to buy your product, and the world of entertainment media is no exception to that rule. There are countless examples across all forms of media of just how important marketing (or the lack thereof) can be to a product's ultimate financial success. While video games have proven to be an incredible gold mine, often generating more income than Hollywood movies, the industry as it exists now is far from sustainable. Publishers need to pour tons of money into development for today's consoles, meaning the risk is extremely high if a product fails--which is why we see fewer original IPs and a market dominated by big-budget sequels. And therein lies the biggest problem of the games industry: the publishers don't understand how to market their product.

"But I see Call of Duty and Halo ads all the time!" one might retort. It's true; publishers do a pretty good job of pushing their biggest brands and creating financial juggernauts. Unfortunately, that means a select few games are basically driving the entire industry--a situation that doesn't require an insightful analyst to recognize as no formula for long-term survival, let alone growth. The problem is exacerbated when companies insist on pumping out new titles for these brands year in and year out, which not only burns out the developers and their creativity, but dramatically accelerates what is known as "franchise fatigue" among consumers. Just look at what happened to Guitar Hero and Rock Band: once the financial gems of the industry for their ability to move not only millions of units of games but also high-return plastic instruments, they are now toxic assets that went dead in the water just as fast as they rose to glory. (In case you're wondering, Activision published a whopping 25 Guitar Hero SKUs in 2009 alone--a year that also saw revenue in the overall genre drop 50% to $700 million from $1.4 billion the year before.)
Once a breakout hit, Guitar Hero is now only a shadow of its former glory.

Now, one could easily argue that the music game genre's catastrophic collapse had more to do with Activision's greedy higher-ups than with a marketing department failing to handle so many similar products. But this argument only circles back to the underlying trend that is the fault of the marketing departments: they are not marketing the right product. We've already gone over how publishers focus on select franchises and try to pump as much money as they can out of those names. The fact of the matter is, they are promoting the wrong names, period. Rather than promoting the game brand itself, publishers should be promoting the "brand" that is the talent behind these games. A bit idealistic, you might say, but let's take a closer look by peeking over at the much-compared medium of movies. Certainly there are cases where movie studios are able to push a brand to sell a movie (Star Wars, James Bond, and pretty much any movie based on a comic book come to mind). But if you think about the way most movies are promoted and hyped, it usually comes down to one of two things (or a combination): the director and the actors. Studios bank on big-name talent to bring consumers to their product. People already know they are probably in for a good movie if Steven Spielberg is the director or if Russell Crowe is the lead actor. These kinds of "stars" do not exist in the games industry--which is entirely the fault of the people who promote the games. Of course, game development is inherently more team-oriented and less based on individual talent, but that is exactly why the industry should start promoting its studios as its stars.

If you don't know Bungie, you certainly know its most successful creation.
To some extent, this has already happened. Sort of. People who follow the industry closely know the name "Bungie" carries a whole lot more weight than "Saber Interactive" because they already know what Bungie has accomplished and what it brings to the table. Bungie hasn't announced so much as a title or even a genre for its next game, but gamers are already salivating for it. The real money, however, is not in the people who already know what games they are going to buy, but in the mainstream market that pretty much bought an Xbox for Call of Duty, Halo, and Madden... because that's all they know. Your average Joe probably doesn't know the difference between an Infinity Ward Call of Duty game and a Treyarch Call of Duty game. The existing practice obviously devalues the developer, but that's not as important to the publisher's bottom line as the eventual sales, so they promote the game brand--which they have more control over--instead. However, incentives exist for publishers as well. If publishers start promoting their studios rather than just their game brands, they'll be able to create completely new IPs (and, to them, potential new series/franchises) on the backs of those developer names alone--that is, without the high financial risk usually associated with a new IP. They won't have to waste money explaining why Game X is the next big thing while not-so-subtly hinting that its sequel is already in the works and will be even bigger. Instead, they can simply say, "this is the newest game from Bungie." Boom, millions of copies sold, done deal. (The studio has to actually make a good game, of course, but a studio that is able to consistently feed its creativity is far more likely to be working as hard as it can.)

This is a bold proclamation that might seem like too big a step for the industry's tepid publishers. Luckily, there is already an example of this practice achieving mainstream success. For years, Rockstar was known for one game franchise and one game franchise only: Grand Theft Auto. Anybody reading this knows those games need no introduction. It's a game brand that practically prints money. Or is it? Last year, Rockstar unleashed its latest project, and it wasn't a game about urban crime. Rather, it was a brand-new game set in the Old West. It was called Red Dead Redemption, went on to sell millions upon millions of copies, and is heralded as one of the best games of last year. While the game is technically a sequel, it shares no ties to its under-the-radar predecessor, which wasn't even originally developed by Rockstar (the former Capcom project was dumped, and Rockstar scooped up the rights, pushed it out, then geared up for the project it really wanted to do). How could what is essentially a new IP make such a splash right out of the gate? Because Rockstar was able to promote the Rockstar brand instead of the game's brand. Very clearly above the title, and before every trailer and commercial, reads a line that says "Rockstar Games Presents." As in, "Rockstar Games Presents Red Dead Redemption." At the very least, pushing their name on the front of the box puts it out there for the future, so even if you didn't know they made GTA, you'll know they made RDR when they ship their next game (which, by the way, is L.A. Noire, and it will also carry the "Rockstar Games Presents" tag on its box).

By doing this, Rockstar has taken steps to promote the Rockstar brand. Suddenly, the company has two mega-blockbusters, yet its next two games (L.A. Noire and Max Payne 3) are not from either of those franchises. Rockstar has been able to successfully promote the Rockstar name, which allows it to explore game ideas in all sorts of stories and settings rather than endlessly iterating on GTA until consumers finally get sick of it. Not only that, but when Grand Theft Auto V eventually does drop, it will come to additional applause and fanfare for returning from a long absence, which generates all sorts of hype on its own. And in the meantime, Rockstar is still making a pretty penny on other, brand-new titles, because people know it's not just that GTA is a good game--it's that Rockstar is good at making good games. Being able to say "from the studio behind Grand Theft Auto" is an incredibly powerful marketing tool, and one that is scarcely used in the game industry--somewhat baffling considering how often you hear similar phrases in movie trailers. Promoting the talent will allow developers to create all sorts of new games, which will stave off franchise fatigue, foster creativity (good for the entire industry), and reduce the dependence on big-budget sequels. This means more creative freedom for developers, a wider portfolio for publishers, and a whole wealth of different experiences for consumers. It's a win, win, win. We "hardcore gamers" already know the great studios. It's time the rest of the world found out as well.

Tuesday, February 1, 2011

Copying vs. Learning

Learn from the best. It's a simple concept, and one that has pervaded all varieties of industries in any capitalistic society. Apple's iPhone was the first smartphone for the masses, not just for businessmen, and three years later we have all varieties of smartphones, many taking important cues from Apple (like multi-touch displays) or trying to improve on its design. Kobe Bryant frequently states that he studies and learns from the NBA's most legendary players to improve his own game. In debating new bills, lawmakers frequently reference how the core elements have already played out in certain states or foreign nations. Citizen Kane is often credited as the greatest film of all time, not because it was so jaw-droppingly entertaining, but because of the myriad technical and cinematic tricks it introduced that are now standard in any Hollywood production. It is an important part of any competitive practice to study the best that's been done, emulate it as best you can, and then build on those key traits in an original way. When it comes to game design, however, many developers shy away from borrowing ideas laid out in other games, and those that do are often derided as copycats.
Darksiders' protagonist, War (center).

In January 2010, an upstart studio called Vigil Games partnered with publisher THQ to release an original game called Darksiders. The game was an original take on the Four Horsemen of the Apocalypse, with a comic book-inspired art design and a semi-open world with linear objectives. When Darksiders hit shelves, it arrived to mixed reviews. The game was heralded as fun and well-designed, but took several knocks for being a copycat of God of War or The Legend of Zelda. These criticisms weren't directed at specific shortcomings, but merely at the fact that Vigil Games didn't reinvent the wheel with entirely new mechanics from the top down. Never mind that Darksiders is hardly the first game to bear some resemblance to the critically-acclaimed Zelda series. Never mind that the elements Vigil borrowed were just that--elements, not entire gameplay ideas or concepts. From the game's announcement, Vigil had stated that they were attempting to make a game in the Zelda mold, with an emphasis on combat that would bring it closer to God of War, yet featuring an entirely new, well-thought-out, and imaginative universe. But to some reviewers, that meant little; Darksiders was just a copycat of Zelda or God of War that was simply not as good.

Trucks and Skulls: look familiar?
My question for these "critics," to which I cannot fathom an answer, is why? Surely there are more egregious copycats out there deliberately trying to steal someone else's idea and make a quick buck off it. Just head to the App Store and look at Trucks and Skulls, a total facsimile of the breakout hit Angry Birds, albeit with a thin layer of fresh paint. But Darksiders was no Trucks and Skulls. It borrowed gameplay elements, sure, but it borrowed them from multiple, entirely different games and meshed them in new ways within an entirely new and compelling presentation. Vigil Games was pretty upfront that Zelda and God of War served as inspirations for their game--and there is nothing wrong with that. At all. Especially when you consider that these are two of the most heralded franchises in all of gaming. Vigil--a new studio headed by a guy named Joe Madureira, who was completely new to the medium--was simply learning tricks from the best in the business and applying them to an entirely original concept. To criticize Vigil for this is to miss the point completely.

That is not to say there aren't "real" copycats out there. Dante's Inferno and the recent Medal of Honor reboot bear some pretty uncanny resemblances to God of War and Call of Duty, respectively, and it's pretty likely that these games were cases of trying to get in on the cash cow. But even in these examples, it's unfair to write these games off as simple copycats. While the gameplay in Dante's Inferno feels like it was pulled straight out of the God of War games, it does bring a unique element to the table in that its story and level design are actually interpretations of a famous literary work (Dante Alighieri's The Divine Comedy). So, in that sense, Dante's Inferno is only half a copycat. But in this case, given the incredible similarities in the gameplay alone (really, play both and you'll see it's almost exactly the same combat mechanics, buttons, moves, etc.), it would be fair to criticize DI's gameplay as a less glorious rip-off of GoW's. In the case of Medal of Honor, the similarities to Call of Duty's Modern Warfare sub-series are even more egregious--and fair game for criticism--but the development team at Danger Close did try its own spin on the formula by using a real conflict in Afghanistan as a setting rather than an entirely fictional one. So, while examples of "copying" do exist, Darksiders is nowhere near what these two games did and should not be criticized in the same way.
Heavy Rain's innovative control scheme allows for more cinematic experiences--and needs to be applied by other developers.

In fact, it's almost as if game developers don't do enough copying. Sure, there are the examples listed above, and there have been other attempts at straight-up copying an idea. But few developers try to do what Vigil Games actually did, which is not to set out to copy or one-up, but rather to study, learn, and apply in a new and interesting way. There are some pretty well-established gameplay models that would absolutely flourish in other settings, other stories, and so on. We saw a glimpse of this potential last year when Rockstar took its own award-winning Grand Theft Auto mechanics and applied them to a Wild West setting in Red Dead Redemption. They obviously tweaked various aspects to fit the setting, timeline, and story, but the hallmarks of GTA were clearly evident. In many ways this is similar to what Vigil was doing with Darksiders, and, frankly, it's a practice that more developers should be looking into. After all, how many gamers heralded the innovative gameplay of last year's Heavy Rain but hated its core story? That game's developer, Quantic Dream, has already stated it is moving on to its next innovative idea and technology, so in its place, who wouldn't want to see a new game take Heavy Rain's mechanics and apply them in wholly original ways? Or, why not take elements from those mechanics and use them to enhance certain parts of other games? The industry holds a wealth of great gameplay mechanics, great ideas, and great interactive ways of engaging the player... developers just need to do a better job of learning from each other. And the so-called "critics" need to back off on their jaded criticism and allow new minds to tackle old ideas.

Sunday, January 23, 2011

The Importance of Semantics

"Semantics" is commonly derided in our daily jargon as being nit-picky and annoying, but today I will make the case of its importance...especially in the case of the adolescent game industry. Semantics--or more simply, word choice--can have very subtle ways of affecting the reader without them even noticing (but, as a Communications major, these are effects I pick out quite regularly). Before we dive in, we should go over just why word choice can be so important at all. When discussing "video games," word choice is everything. The medium itself is very young by the standards of media overall (just look how long books, movies, TVs, etc. have been around). More significantly, its roots are not so much that of a new communication medium but rather a new genre of toys. The Atari 2600 and the Nintendo Entertainment System were not heralded as a coming of a new way to tell and experience stories, but rather as a new way to kill time and have fun--especially for kids. This is why many assume (sometimes incorrectly) that the adolescent male is by far the dominant demographic of game consumer; though historically this may have been true, it's not today, and the change is happening fast than most people realize. In fact, many gamers don't even want video games to be looked at as a storytelling tool; they are quite happy with their delightful hobby and just want to play games as games--for fun.

Zaeed doesn't like it when Cmdr. Shepard forgets to italicize Mass Effect.

As I discussed last week, the growing split in game design (and consumption) does not mean that one side is "right" and the other needs to go away. They can both very easily co-exist, but as a whole, it's time for the general public to start paying more respect to games as a true storytelling medium. And for everyone else to respect games, gamers themselves must respect games. That's where semantics come in. First and foremost, when writing about games, the title must be placed in italics. You'll notice that The Paradigm-Shifting Blog adopted this practice long ago, and for good reason: in the English language, you are supposed to italicize the titles of full bodies of work. Like a book (duh). Also a movie. A newspaper. A music album. A television series. A video game. As you likely noticed, a "full body of work" does not necessarily mean "a piece of media with a long story." Game titles need to be italicized. Period. If we gamers cannot respect our craft with something as simple as recognizing each game as someone's (or some group's) complete body of work, then why should anyone else? No more Mass Effect, Uncharted, or even Super Meat Boy. It's Mass Effect, Uncharted, and Super Meat Boy.


There was a time when the only objective was to "go right and you win," but not anymore.
Okay, so maybe italicization is not so much "semantics" as it is a lesson in proper grammar, but the idea is the same. It's part of an overall effort to separate video games from toys. Even if the medium originated that way, it has grown into a full-fledged entertainment outlet. In the old days, the entire point of playing a game was to overcome an artificial challenge. There was no story progression or character development. That was it. "Beat" the game. Except we still use that word. My roommate ran down to my room last week to tell me that he "beat" Assassin's Creed: Brotherhood. What he really meant to say was that he finished it. What did he beat? What challenging algorithm did he have to outsmart? What patterns did he need to recognize and execute to perfection? Of course, some gamers still play "story" games--including Assassin's Creed--for that same sense of overcoming an explicitly designed challenge. This is why most games still provide some form of "hard mode," where the point is to be better than the computer. You bested it. So in that sense, yes, you can still "beat" a hard mode, or, if you are playing a game solely for the challenge, you can "beat" it by doing everything you've been tasked to do. But if you are playing the game not only for the fun factor but also to experience an incredible story (which is the reason most people have for playing games like Assassin's Creed), then you're not "beating" anything; you're finishing the story. This distinction is small and may seem unimportant, but it is part of the overall idea of separating video games from toys and grouping them in their proper place as an informational medium.

Another example of poor word choice is the prevalent use of the term "franchise." Now, publishers often use "franchise" quite accurately to describe their hopes and wishes for a game brand. Halo, for example, is very much a franchise, having games developed by different developers, a toy line, an anime series, a long-gestating film project... you get the picture. "Franchise" refers to the big-picture money-maker that is the overall brand. But let's take a different angle on this. Obviously, Star Wars can easily be considered its own franchise. But within that gargantuan franchise lies the core of it all: the "series" of Star Wars movies. They exist in the same medium and tell (for all intents and purposes) the same story. They are direct sequels/prequels of each other. They have self-contained story arcs and ones that play out over whole trilogies (or two!). I bring this up because many game developers, writers, and analysts often refer to a series as a franchise when the two terms are not interchangeable. As with Halo, "franchise" definitely applies in some cases--and perhaps many of these developers truly wish for their baby to become a true franchise. But when referring to a specific series of games, labeling it a "franchise" really devalues it from an artistic perspective; doing so is like saying the game is being made for the sole purpose of making money. Like a toy. And, frankly, like how some games in this business are made (Call of Duty says hi). Obviously, all developers hope to make money off their work. But referring to the Uncharted series as a "franchise" severely devalues the work of developer Naughty Dog in pushing the limits of game design, narratively and otherwise.
Nathan Drake's adventures are about a lot more than just filling the bank.

So, I task you with being aware of these important distinctions when you are reading, writing, or talking about games. These are far from the only examples; if I truly wanted to write about every last one of them, I could probably fill a book. But you don't need to read a book to get the central idea behind this article. Games are not toys anymore. In many ways, video games are a lot more than the traditional definition of a "game"--perhaps the most striking semantic violation of all. I've long considered "video game" a somewhat antiquated term for this interactive medium that I've come to love and respect so much. Right now "video game" is really the only term we've got... but it's about as relevant as describing a movie/film as a "motion picture" (after all, movies are a lot more than simply moving images nowadays). With that thought, I'll leave you, the reader, to engage in a collective brainstorm over what we should really call the interactive medium long referred to as "video games." What's the best you've got? Post your ideas in the comments.

Monday, January 17, 2011

The Growing Dichotomy of Game Design

There is something happening in the industry of video games: a fissure that has been growing steadily wider for the past decade or so and will only continue to widen in the future. The change it brings has ramifications for fundamental design, for the way games will be sold and marketed, and even for the way they are played. It comes down to the central philosophy driving a game's creation. In layman's terms, the dichotomy is this: is the game focused on story, or on play mechanics?

The heights of realistic graphical and audio quality that today's consoles are capable of have enabled story and narrative to take a more central role in many of today's games. However, certain game genres--for example, puzzle, fighting, and racing games--are built entirely around the "fun" of their mechanics, and any story is usually superfluous and unimportant (or even nonexistent). On the other side of the coin, you have games like Mass Effect, Heavy Rain, and Uncharted that are starting to rival Hollywood movies in story quality, production value, and creative talent. While these games take strides of varying lengths toward being "more movie-like," they all retain a common denominator: the medium's intrinsic interactive qualities are integral to the way the story is experienced. That is to say, the fact that you, the player, are directly impacting the story is a key differentiator between movies and games and is still a defining aspect of these "story-focused" games; the difference is, it's not the only defining aspect.

Heavy Rain's revolutionary control scheme brings a new dimension to interacting with the game's story.

In fact, play mechanics are just as important to "story" games as they are to "mechanic" games. As Epic Games' lead designer Cliff Bleszinski just so happened to say today on Twitter,
"If you have a great game with a bad story you still have a great game. If you have a bad game with a great story you still have a bad game. [But] if you have a great game with a great story then you have a classic."
To understand this, you must recognize that the emergence of story as a central element in many modern games is a relatively new phenomenon; in the old days, story was very much an afterthought, and the mechanics always came first. Furthermore, the thing that makes a game a game is that interactivity is central to the experience... and interactivity is achieved via play mechanics. That being said, the few developers that have dipped their toes in the vast ocean of great narratives have opened whole new doors of possibilities.

The added dimension of interactivity means games are capable of stories that no other medium can tell, and by combining great mechanics-level game design with the high-quality narrative, characterization, and production values of movies, the door is open for fantastic new experiences to emerge. Games like BioShock have already played with the notion of interactivity as a central element of the game's story. Games like Mass Effect give your individual decisions a ripple effect that shows ramifications throughout the entire game's world--or multiple games' worlds--and affects your path through the story and the way other characters react to you. Games like Uncharted are more exciting than any summer blockbuster at the movie theater because you are participating in the action, while witty writing and strong voice acting bring the story to a more personal level.

The way you've acted as Commander Shepard decides whether or not Wrex lives in Mass Effect.

But not all games strive for these goals--and not all need to. There is room in the market for games that merely want to do what games were originally designed to do, without all this fancy-schmancy new-age story business. Fighting games in particular excel at using their idiosyncratic fighting mechanics to drive competitiveness in players. Racing games center themselves on two simple things: driving cars really fast and being the first one to the finish line. These games don't need canonical explanations; they simply need to be fun. And that's just fine. Even some games that do have stories, like the Call of Duty series, are not necessarily trying to push the envelope of interactive storytelling, and are arguably designed with a gameplay-is-the-only-important-thing philosophy--not that you can blame them, given that the multiplayer (arguably a separate "mechanics" game) is so important to their sales.

SoulCalibur IV doesn't need a story to be really fun.

This is the divergence we are seeing in game design today. Some games are simply going to push the fun factor of what games can do. Others are going to push the artistic envelope of the ways they can use interactivity to tell an engaging story. Similarly, when gamers want to play a certain game, they'll often do it because a) they love its mechanics or b) they love its story. It's not a matter of one design philosophy "winning" over the other; it's about the way they coexist and feed off each other. Mass Effect 2 is a great example, learning from the cover-based gunplay of Gears of War to make playing through its story more fun whilst still retaining its focus on story and character development. The emergence of truly great game narrative and a push to take artistic advantage of games' interactivity does not mean that "fun" games will go away. What it does mean is that games no longer have to be fun, whimsical distractions, and can instead provide an artistic canvas for exploring higher-level questions, such as the illusion of free will (BioShock) or the ethics of cloning (Mass Effect).

Mass Effect 2 combines its great narrative with Gears of War's fun cover system.

What do you think of this growing dichotomy of game design? Share your thoughts in the comments.

Monday, August 16, 2010

Spotlight: Superstars of the Gaming Industry

For those of you who don't know me on a personal basis, my ultimate career goal is to be a game designer who helps push the industry forward, both from the inside with innovative gaming ideas and from the outside with fresh approaches to the business. Particularly in regards to the way games are pitched, approved, developed, and marketed, the industry has clung to some seriously archaic practices--but that is a topic deserving its own in-depth article. Let's just say that one aspect of my strategy is changing the way games are promoted, and it starts with promoting the creators themselves. Many consumers don't understand the difference between a publisher and a developer, let alone know the creative talent that serves as the driving force behind our favorite games. Below are who I think are the biggest, most influential, and most successful creative minds our industry has today--names that gamers from casual moms and dads to hardcore enthusiasts should all know. These are obviously some of my biggest idols in the industry, and increasing mass-market awareness of creative talent starts and ends with these people.


VINCE ZAMPELLA & JASON WEST-- Let's start with two names you might have heard because they've been in the news a lot this year. They are undoubtedly best known for founding and directing Infinity Ward, creators of the Call of Duty/Modern Warfare series, and, following an ugly dispute with parent company Activision Blizzard, they have left to form an independent studio named Respawn Entertainment. Before Infinity Ward, they created the Medal of Honor franchise for Electronic Arts (with whom Respawn is again partnered). They have attracted the eye of Hollywood agents and must be regarded as the mastermind architects of the first-person shooter action blockbuster.


CLIFF BLESZINSKI-- Probably one of the most recognizable names on this list, simply because Microsoft has leveraged him as the face of Epic Games in press coverage of the Gears of War series. A long-time employee of Epic and high-level designer on Unreal, he got his biggest break as lead designer of Gears of War, the first real "killer app" for the Xbox 360. Since then he has been involved in overseeing and promoting Epic's projects big and small, from the Gears and Unreal sequels to new projects like Chair Entertainment's Shadow Complex and People Can Fly's Bulletstorm.


SHIGERU MIYAMOTO-- Widely regarded as the grandfather of modern video games, he hit the scene creating the arcade hit Donkey Kong for Nintendo in the early '80s, but his true claim to fame has been constructing just about all of Nintendo's massive stable of franchises, including Super Mario, The Legend of Zelda, Star Fox, Pikmin, Wii Fit, and many others; needless to say, it's quite obvious why you should know who he is. His understudy, EIJI AONUMA, is another important Nintendo designer, having taken point on every Zelda game since Majora's Mask on the Nintendo 64.


HIDEO KOJIMA-- Much like Miyamoto at Nintendo, Kojima-san has risen to deity-like status at Konami, where his brainchildren are the system-selling PlayStation games that make up the Metal Gear Solid series. Despite repeated Brett Favre-esque retirement claims, Kojima-san has maintained tight oversight and creative control over Metal Gear and his internal team at Kojima Productions. Konami values him so much that it put him in charge of its effort to revive Castlevania on consoles with this fall's Lords of Shadow.




WARREN SPECTOR-- Another old name in the industry, Spector has long been at the forefront of pushing engaging, interactive storytelling. His resume includes a plethora of critically-acclaimed darlings, such as Ultima, System Shock, Thief, and Deus Ex. He is currently working on Epic Mickey, Disney Interactive's Wii-exclusive title coming this fall, and recently tweeted his desire to design a game based on DuckTales.







PETER MOLYNEUX-- Molyneux made his name creating PC "god games" like Populous (at Bullfrog) and Black & White (at Microsoft-owned, UK-based Lionhead Studios, where he is creative head). He's also served as creative director of all the Fable titles, claiming during the development of the original that he wanted to create "the best RPG ever made." He's consistently been a great designer to interview, as his passion for the medium always seems like it can hardly be contained--which makes his PR handlers pretty nervous. Last year, Molyneux was promoted to creative boss of all of Microsoft Game Studios' European branch, though he is still actively involved in the development of Lionhead's games, including this fall's Fable III.

MARTY O'DONNELL-- He's the only person on this list who's not directly involved with the big-picture design of his games, but that doesn't mean he isn't equally important. He's known for his work on Bungie's Halo games as audio director and composer, and his work is easily on the level of Hollywood legends like Hans Zimmer and maybe even John Williams. He is without a doubt the best in the business when it comes to audio production, and mark my words, we've yet to see his best work, especially now that Bungie is moving into uncharted territory after Halo: Reach.



Dr. RAY MUZYKA & Dr. GREG ZESCHUK-- "The Doctors," as they are often called, are founders of arguably the most talented game developer on the planet: BioWare. They went through med school together, but after completing their degrees decided they wanted to take a shot at exercising their creative muscles, so they founded BioWare with another of their med school buddies. Since then, they have cemented BioWare's reputation as the pre-eminent RPG house with Baldur's Gate, Neverwinter Nights, Star Wars: Knights of the Old Republic, Mass Effect, and Dragon Age. Their mission has been to consistently push the medium forward and truly take advantage of the player-interaction element inherent to games. Their studio has fostered other talent as well, including CASEY HUDSON, who has served as creative director for the Mass Effect series.


SHINJI MIKAMI-- Mikami-san is probably the most under-appreciated of the top-tier Japanese game designers. While his exposure has increased as of late through his association with Platinum Games, he formerly created a slew of masterpieces at Capcom and its now-closed subsidiary Clover. His more notable games include Resident Evil 0 through 4, Dino Crisis, Devil May Cry, Steel Battalion, Viewtiful Joe, Killer7, Bayonetta, and the upcoming Vanquish. The aforementioned Platinum, a studio of former Clover developers, is one of the preeminent studios in Japan.





KEN LEVINE-- The creative boss at Boston-based Irrational Games, Levine comes from the Warren Spector school of game design, having worked with him on Thief: The Dark Project. He has gone on to develop critical darlings such as Tribes, System Shock 2, Freedom Force, and most recently BioShock. Just last week, three years after BioShock's release, he unveiled Irrational's next project: the equally stunning and original BioShock Infinite.





TODD HOWARD-- He is the creative head of Bethesda Game Studios, known best for The Elder Scrolls series and most recently Fallout 3. His next title is shrouded in mystery, but much like with Levine and Irrational before the Infinite announcement, that doesn't mean it isn't highly anticipated. Howard and his team at Bethesda have solidified themselves as the cream of the crop when it comes to open-world RPGs, as their let-you-do-anything-and-I-mean-anything approach has captured the imagination (and free time) of countless gamers.






AMY HENNIG-- She's the creative director at Naughty Dog and the only woman on this list. At Naughty Dog, she oversaw the creation of the PlayStation 2's standout Jak and Daxter series, while in the HD era she has led development on the PS3 blockbusters Uncharted: Drake's Fortune and Uncharted 2: Among Thieves... the latter regarded as one of the best games of this console generation when it comes to both story and gameplay, as well as the intertwining of the two.



DAVID CAGE-- Cage is the leader--creatively and otherwise--of French development studio Quantic Dream. If you've played any of their games (most notably Indigo Prophecy or Heavy Rain), then you know that their products represent an entirely different approach to game design, one rooted closer to cinema (with an interactive filter) than to "traditional" games. In fact, it's hard to describe one of their games without actually playing it, but suffice it to say that David Cage holds an increasingly ambitious vision for the future of the medium, and it comes through in his envelope-pushing creations.








Well, there you have it: the biggest and brightest stars of today's development landscape. That's not to say this is a comprehensive list, especially when it comes to gaming's history; the reality is that many, many people have influenced the medium. If the above heavyweights have piqued your interest, you might look up some of the biggest names of gaming's past, many of whom still have projects in the pipeline: Tim Schafer (Brutal Legend, Grim Fandango, Monkey Island, Psychonauts); Hironobu Sakaguchi (Final Fantasy, Chrono Trigger, Parasite Eve, Lost Odyssey); Tomonobu Itagaki (Dead or Alive, Ninja Gaiden); Yuji Naka (Sonic the Hedgehog, NiGHTS); Michel Ancel (Beyond Good & Evil, Rayman, King Kong); David Jaffe (Twisted Metal, God of War, Calling All Cars); Jordan Mechner (Prince of Persia); Will Wright (SimCity, The Sims, Spore); Sid Meier (Civilization).

Thursday, May 27, 2010

LOST: The End of an Era


Last weekend, arguably the greatest show in the history of television came to its emotional conclusion, culminating six years of a cultural phenomenon that captured the hearts and minds of its viewers like no other television program ever has. Lost is famous for its mind-boggling mysteries, planting clues in one episode and revealing their true purpose entire seasons down the line. But as much as viewers loved to debate the meaning of numbers, smoke monsters, and 1970s science expeditions, the true heart of the show always has been, and always will be, its beloved characters. Those of us who have been with Lost since the very beginning (myself included) grew attached to these characters as we rode the roller coaster that was the show's six seasons. Now that the show is over, the aftertaste is bittersweet: on one hand, the show ended on a high note and firmly resolved these characters' lengthy journeys; on the other, we must lament that the journey has in fact reached its ultimate conclusion. And the impact of the show goes far beyond its mysteries and characters.

When Lost first aired in 2004, the scripted TV show was very much an endangered species in the face of cheaper-to-produce, higher-margin reality TV. Lost changed all that by becoming one of the most critically, culturally, and (most important to ABC) financially successful television projects ever, even with the enormous costs it took to produce (the pilot was the most expensive ever filmed, with the bill clocking in at over 2 million dollars according to the finale pre-show). As a writer, I must also acknowledge how Lost managed to create an entire cast of thoroughly engaging, unique, and interesting characters, and over the course of its six seasons the show built an incredible relationship between those characters and its viewers. In the age of big-budget visual effects and action, Lost brought storytelling back to its roots by reminding us just how powerful strong characters can be, a fact that is especially magnified in the long-form medium in which Lost was delivered.

I had originally planned to write a full-on impressions piece on the series finale and the culmination of the series as a whole, but the Lost community has proved spectacular as always, and I will instead be tipping my hat to some of the great pieces that have come out since we reached the end.

G4 writer Patrick Klepek penned a wonderful reaction to the Lost finale, making just about every point I would have made in my own:

College Humor ran a humorous video tallying up all the mysteries that the show never answered. Definitely a must-watch for longtime Losties:

Finally, a fan cut together the opening sequence of the pilot episode with the closing sequence of the finale. It's amazing to see the show's beginning and ending side by side, and especially how the two sequences mirror each other:

Thursday, May 13, 2010

A New Strategy for Downloadable Content (DLC)



The advent of this generation of consoles' internet-centric experiences and dedicated storefronts (Xbox Live Marketplace, PlayStation Store, Wii Shop Channel) means that DLC has reached unprecedented levels of importance. This is especially true when combined with the skyrocketing costs of game development; DLC represents a relatively inexpensive way to make more money off the same consumers. In fact, most games these days plan for post-launch DLC before they ship. Years ago, Microsoft publicly advised developers that DLC should be made available within 30-90 days of launch, warning that otherwise gamers will have moved on. In addition, many developers have been quoted as saying that it is virtually unfeasible to take DLC from idea to release entirely post-launch and that it needs to be worked on prior to a game's release. Given all this, developers have a pretty good, if not absolute, idea of what they have in the pipeline. Problem is, developers and publishers never share those plans beyond a vague "we are working on it." I claim that this practice is detrimental to the goal, progress, and idea behind making and releasing DLC.

It is true (as Microsoft claimed) that gamers consume their media at such a fast pace that if DLC is not released close enough to launch, they (as a whole) will either trade in their games or lose interest. Furthermore, some gamers don't even know about DLC to begin with, whether from lack of internet access or--the more relevant problem--because they don't take the time to search for it. The latter is the kind of gamer who turns on the system and jumps straight into the experience, skipping over the ads on the dashboard (and for lower-profile games there may be no dashboard advertisement at all). If publishers want gamers to buy DLC, those gamers need to know about it, and they shouldn't have to spend time surfing the dashboard or sites like IGN just to dig up scant details. The reality is that the majority of gamers just aren't interested enough to put in that much effort.

And so we get to the crux of my argument: DLC plans need full disclosure. There is no reason for publishers to keep their DLC plans hidden from the people buying the retail boxed copy of the game. Instead, there should be a pamphlet on top of the instruction booklet that tells you exactly what's coming, where to get it (e.g., XBL Marketplace), and, most importantly, when. This strategy could even be extended to the normal marketing of the retail product. The idea is to put DLC in the head of the consumer from the get-go, allowing excitement to build, especially if there's a solid date to look forward to. Ideally, this "DLC Schedule" would even lend a small description (if not more) of the to-be-released content so that gamers actually have some substance to get excited about. If gamers know exactly what's coming, they can get excited for it, plan to buy it, and spread the word about it.

Now, DLC is, of course, unfinished work at launch--this is the reason it takes so long to release in the first place. But going back to the beginning of this article: by the time a retail game goes gold and is prepping for release, the developers already know what DLC they are making. If they can't lock down a date, they can at least give a specific window (e.g., "March" or "early March" instead of "March 3"). Give out the details that are already locked down--the ones that articles currently reveal only a week or two before release. The more lead time, the more time for anticipation to build. And should something go amiss, DLC can always be delayed, canceled, or added to the schedule; these things happen with retail games all the time.

As it stands right now, publishers are simply not doing a good enough job creating awareness for their DLC pipeline, and therein lies the biggest barrier to increased penetration of DLC. If people don't know about it, how can they know that they want to buy it?

Wednesday, January 20, 2010

Editorial: Assassin's Creed II Ending Analysis


MASSIVE SPOILER WARNING--This editorial discusses critical plot events in Assassin's Creed II and should not be read if you are still going to play through the game.

So you've finished Assassin's Creed II, all the way through DNA Sequence 14, the hideout escape, and the final chatter scene. You've found and solved all 20 of Subject 16's glyph puzzles. You've seen The Truth and had Minerva talk to you through a 500-year-old ancestor. How do you make sense of all the craziness involved in the events listed above?

First of all, let's just say that it's really creepy when Minerva breaks the fourth wall, in some sense, and looks directly at you when she talks... although you don't notice it the first time. Let's start with the obvious: Minerva could see into the future and knew that Desmond would be getting this message in 2012 by visiting Ezio's genetic memory, so Ezio's overall role is just as a relay to Desmond (who is, for the first time, noted as extra special compared to the other 16 test subjects). Minerva herself is of a humanoid race that predated human beings on Earth as a highly advanced civilization. Minerva's people--interpreted by humans' later descendants as gods--created human beings in their image as something of a peon class. Through some of your research in obtaining the glyphs, you learn that scientists have discovered a neurotransmitter in the human brain that seems to serve no purpose; in reality, that neurotransmitter is activated by the Apple of Eden, which allows humans to be controlled. You also learn that the Templars (presumably) planted evidence of the "missing link" in human evolution so as to distract people from the Truth that humanity was created. (The glyph puzzles also basically say that every great historical figure used a Piece of Eden to achieve their success.)

Back to Minerva: she says that a war eventually broke out between her kind and the humans (more on that in a bit). While both species were consumed by war, a cataclysmic event involving the sun ravaged the Earth (it flipped the magnetic poles). Though Minerva's kind were more advanced, the humans had sheer numbers and were able to survive, while only a few of Minerva's kind remained. Their scarcity--and the fact that they had created humans--further supports the idea that this is how humans developed the concept of gods and referred to this species as such. As Minerva's kind died out, they built vaults (housed in great temples) to relay this message to the future, so that humans could avoid the same fate when it happened again. At the very end, when you can hear the Assassins talking to each other in the truck, you learn the bit about the cataclysmic solar event. Desmond's story also takes place in 2012, a year famously associated with cataclysmic disasters, and polar reversal is one of the leading theories.

So that's where Assassin's Creed III will be going, but what about the rest of this crazy backstory? Well, The Truth video from the glyphs is a memory of Subject 16's, meaning that he is a direct descendant of Adam and/or Eve from the video (by the way, the Adam/Eve names echo the Biblical story of human creation, especially the Forbidden Fruit). Since Adam and Eve were basically the first Assassins, that means all Assassins are of their descent. What is likely is that Adam and Eve were hybrids of Minerva's species and our own, or that one of Minerva's kind helped a human escape and steal the fruit. In any case, their descendants--the Assassin line--are separated here from the rest of the human genome. Anyway, Adam and Eve steal the Apple from Eden in this sequence and likely start the human rebellion that Minerva spoke of. The Templars must be a sect that has knowledge of the Apple and the Pieces of Eden and wants to use that power to indoctrinate the world (perhaps related to the Assassin line but gone bad). Minerva does say that Desmond will have to "fight against the Cross," and the Templars' symbol is a cross. The rest of the world has forgotten all about the Pieces, referring to them only in mythical stories (such as Adam and Eve and the Forbidden Fruit).

So there you have it--what are your theories? This is crazier than even National Treasure or a Dan Brown novel, and it stands as one of the few examples of truly incredible video game narrative.

Thursday, July 30, 2009

Editorial: The Tainted Hall of Famer?

Today, The New York Times reported that its sources have confirmed that star sluggers Manny Ramirez and David Ortiz were among the 104 players who tested positive for performance-enhancing drugs back in 2003. The test--which was supposed to be anonymous--was conducted as a survey to determine whether an official drug-testing policy should be implemented (which it was). Other famous names reportedly on the list include Barry Bonds, Sammy Sosa, and Alex Rodriguez. The leaks themselves are controversial enough, especially from a legal perspective, but amid the rampant questions surrounding Ortiz today, much more significant news may have been overlooked.

In an article by ESPN's Pedro Gomez, former MLB slugger (and self-admitted steroid user) Jose Canseco says that he was not at all shocked by today's news and that he had already known about both Manny's and Ortiz's places on "The List". He goes on to say that "Major League Baseball is going to have a big, big problem on their hands when they find out they have a Hall of Famer who's used." Canseco declined to elaborate on who that Hall of Famer might be, but he does imply that the player in question was on that 2003 list as well; Gomez, however, quickly moves on from the subject to discuss Canseco's warning of a potential class-action lawsuit. But it is in fact the prospect that a steroid user may have already gotten into the Hall that resonates most, especially when fierce debate surrounds other Hall of Fame-caliber players alleged to have used the drugs, such as Mark McGwire.

So who is this mystery man? Well, a quick run-through of Hall of Fame members yields only four players who even played during the "Steroid Era" (the 1990s or later): Rich Gossage, Cal Ripken Jr., Tony Gwynn, and Rickey Henderson. Of those four, Henderson is the only one to have been active in 2003 when the survey testing was conducted. The timing of Canseco's revelation matters too: he has clearly enjoyed the attention this issue brings him, yet only now decides to share that a Hall of Famer used these drugs--perhaps not at all coincidental considering that Henderson was inducted just this past weekend. When you also consider that the end of Henderson's career was characterized by desperately trying to hang on to a place in the big leagues, it's not a stretch to suggest he might have given himself some help in order to compete with younger talent--many of whom were allegedly using PEDs themselves. While it's entirely possible that Ripken or Gwynn is the alleged user (or that no current Hall member is), the signs definitely seem to be pointing toward Rickey Henderson.

Should it later be revealed that Henderson did in fact use performance-enhancing drugs, Major League Baseball and the National Baseball Hall of Fame will have an enormous mess on their hands. Would they consider ejecting someone from the Hall? What about all the players coming up on the ballots in future years who were also tied to steroid use? Do they go ahead and implement the dreaded asterisk? If nothing else, the future should be interesting. It is worth noting that Canseco has a pretty good track record with his accusations thus far, so let's put the official question out there for the better educated to ponder: did Rickey Henderson use performance-enhancing drugs?

Tuesday, July 21, 2009

Editorial: Football--the kind we call 'Soccer'--in the United States

Soccer is such a divisive subject among the American sporting public. Outside of its establishment as a common childhood sporting activity and the heralded American "soccer mom", the sport is often seen as the black sheep next to much more popular sports such as baseball, basketball, and our own American football (just plain "football" to most folks). Yet despite its minuscule albeit growing popularity here, outside of our country (and our neighbors to the north, who prefer competitions on ice) it is the sport of choice for citizens the world over. On ESPN this morning, Kenny Mayne proclaimed that the sport had 2 billion fans worldwide--an amazing figure when weighed against the roughly 6.7 billion people in the world (that's roughly 30% of every person on the face of the earth, including rural and indigenous peoples). But why is the story so different in America?

Many American sports fans complain that the low scores and the extreme difficulty of scoring hinder the excitement of the game. In fact, similar complaints forced the National Hockey League to adopt rule changes designed to facilitate slightly higher-scoring games. But latching onto this idea dilutes much of the appeal the sport has to offer. See--brace yourselves for this, people--scoring isn't the only thing that's important in the game of soccer (sorry, footy devotees, but I'm sticking to the American name for the duration of this article). Much of the excitement really lies in the ongoing struggle leading up to every goal, which infuses each eventual goal with an explosive energy that finally boils over and blows its lid, Mount St. Helens-style. And unlike other sports, the action is almost completely uninterrupted during its two halves, raising tensions much higher--similar to intentionally long takes in some movies. In addition, we as a culture have been bred and weaned on things that provide instant gratification, so the long battle for those precious goals faces some built-in sociological barriers. And to those who claim that soccer is not a contact sport, I urge you to go play against some people who actually know what they are doing; soccer, even at the sub-professional level, features a level of physicality that rivals--if not exceeds--a hard-nosed game of hoops.

Perhaps the physicality takes a bad rap in America because of the abundance of players who seem to intentionally flop (soccer fans call it "diving") in order to draw a foul call, lying on the ground in supposed "agony" and then gingerly running about moments later. For those who get a bad taste from this common sight in the European game, I urge you to look closer and benefit our fellow Americans at the same time; by that I mean, go watch a game of MLS soccer. While the level of competition is much lower than the best that Europe has to offer, the league is growing very quickly, and watching it you can see that players--particularly American-born ones--seem far less bothered by intense physical contact. Sure, people get hit and fall all the time, but instead of pouting on the ground they spring up and fight back into the play. My theory is that this trait is attributable to the encouragement of toughness in other sports and in our culture overall--it would be a sign of weakness to just lie there. Major League Soccer has a lot to offer, but I digress; back to what makes the game and our league so different compared to the rest of the world.

One very interesting theory is that the game carries a lot of cultural undertones that are perhaps underappreciated in any debate regarding the popularity of the game in America. I first heard this theory while waiting in the drive-thru line at In-N-Out Burger, listening to ESPN Radio. I unfortunately do not remember the specific show or host, but he was discussing his own theories and taking calls on the importance of culture in the acceptance of a sport. He started by pointing out that soccer has never been popular in our country, and that it's perhaps at its highest point now. Why is this, when the entire rest of the world has embraced it so thoroughly? Well, for one, it's very cheap, necessitating only a ball, flat ground, and whatever random materials can be scrounged into makeshift goals. Compare this to basketball, where at least a ball, reasonably hard ground, and a specially designed metal hoop are needed; baseball, where a myriad of equipment is required, from a leather glove for every single player to bats, hats, and bases; and football, where pads, helmets, many players, and much planning are required. The low cost is an understated benefit in attracting players of all classes, races, and regions of the world, including the Third World.

In regard to its lack of popularity in America specifically, the same radio host pointed out some interesting history. Soccer really developed and emerged into its modern form in the late 1800s and early 1900s, particularly in Europe and its colonies. At that point in history, the United States was still a growing young nation, a far cry from its future superpower status, trying to establish its own identity. At the same time, many Europeans were immigrating to America in hopes of starting an entirely new life. Combine these Europeans' desire to leave everything in their old lives behind--including the decidedly European game of soccer--with Americans' desire for something decidedly their own and not foreign, which they found in baseball and boxing. These factors have, from the beginning, placed soccer in the awkward position of being somewhat un-American.

However, discussion regarding soccer has grown ever louder in recent years, and many debate how long it will take--if it happens at all--for soccer to reach a level of popularity in our country that compares to its standing in the rest of the world, or at least enough to be considered one of our "major" sports (right now generally considered to be football (NFL), baseball (MLB), basketball (NBA), and hockey (NHL)). The U.S. men's national team sparked much of the most recent discussion when it toppled world #1 Spain in the Confederations Cup semifinal and jumped to a 2-0 lead over perennial power Brazil in the final, albeit eventually conceding three goals in the second half to lose.

I argue, however, that the recent climb of soccer actually started much earlier--in 1994, to be exact. You see, in 1994 the United States played host to the grandest sporting competition in the entire world, the FIFA World Cup. Part of the US Soccer Federation's deal with FIFA in winning the bid to host the cup was a promise to establish a major professional league in this country. Major League Soccer kicked off its inaugural season two years later, in 1996, and after a few problems early on, the league is now growing astronomically, adding expansion teams on an annual basis (and two new franchises next year!) and attracting more and more world-class players. In 1999, the United States women's national team won the whole shebang at the FIFA Women's World Cup, for the first time giving the United States a title as best in the world in soccer--a distinction that some sports analysts claim the men's team needs in order to really establish any mass-market interest. In the 2002 World Cup in Korea/Japan, the U.S. men's team achieved its best finish of the modern era, not only making it out of the group stage but also winning in the Round of 16 to reach the quarterfinal, where it lost to eventual runner-up Germany. The 2006 outing was very flat, but after the 2009 Confederations Cup performance, the United States has established itself as a team on the rise. The future looks particularly bright when you consider how young the national team is right now, anchored by 27-year-old Landon Donovan and budding young potential superstars like Jozy Altidore and Michael Bradley. The U.S. is pushing to host another World Cup in either 2018 or 2022, which would likely boost the sport's profile even higher. As for myself, I first truly got hooked on the sport during the 2006 World Cup in Germany.

So I urge you, sports fans of America: give soccer a look despite all your prejudices against it. We already watch the World Cup every four years, and whenever our national team does well. Yes, it's hard to follow the best club teams when they play all the way across the Atlantic Ocean, but go watch your local MLS team and get a feel for the high fan energy at those games. Two billion people can't be wrong. MLS is on the rise, and truthfully, it's only a matter of time before the league is one of the best in the world. It might take twenty years, but America has shown just how willing it is to slavishly follow sports--and multiple ones at that.