Thursday, April 8, 2010
Once upon a time, people didn't buy videogames. They went to an arcade, and bought playtime in twenty-five-cent increments. How much time a quarter bought depended entirely on the skill of the player. An unskilled player would quickly find their progress barred, and need to supply more quarters. A skilled player could proceed much longer, and was thus rewarded for the time, effort, and money poured into gaining that skill. The public nature of the arcade also rewarded the skilled player with the opportunity to show off in front of others. This gave unskilled players something to aspire to, and suggested it would be worthwhile to keep feeding the machines with quarters, so that they too might someday bask in similar glory. So it made a great deal of financial sense for arcade games to feature limited lives, with more available for purchase.
Eventually, videogames moved from the arcade to the living room. Here it was much harder for players to compare themselves to other local players, and there was no need to keep the quarters flowing, since games were purchased outright. The reasons to limit lives had vanished, and barring the progress of unskilled players now served mainly to disrupt the experience and prevent those players from seeing all the content they had already paid for. This limited the games' potential audience - why buy a game you can't expect to make it through? Financially, it made no sense whatsoever for games played in the home to feature limited lives.
But that didn't stop them from doing it anyway. From the original Super Mario Brothers on the NES all the way up to New Super Mario Brothers on the Wii, mainstream games have still not completely shaken off the limited-lives tradition. Why?