Next-Gen Gaming - A New Scene Or An Old One Rehashed?


Labeled With: playstation 3, xbox360, wii
Written by Cellfish on Thursday, January 25, 2007

In the early 1980s, much like today, there was a console war. Consumers had to choose between the veteran Atari 2600, with its gigantic library of games and user base of 10 million; the Mattel Intellivision, the first 16-bit video game console, with its superior graphics; and the ColecoVision, with its faithful translations of arcade games. It was 1982, games were selling like hotcakes, and it seemed like there was no end to the money rolling in. In 1983, however, it all came crashing down. Atari, with its aforementioned library of software titles, learned that quality, not quantity, drives a business in the long run. Claiming the largest selection of games only means something if you can also claim a good ratio of quality to crap. Quality had never been part of Atari's business plan, and that sealed its fate. Consequently, the crash of the North American video game market seemed to confirm that the video game craze of the late '70s and early '80s had been nothing more than a fad.

How has the industry changed since then? For starters, when Nintendo revived the American video game industry a mere two years later, it implemented something simple to ensure that Atari's mistakes would never be repeated: developers had to submit their titles to Nintendo for approval. If a game met Nintendo's strict quality standards, the developer was allowed to release it on the console and amass a fortune. If not, the title was rejected, and the developer had to look to other console manufacturers to release their inferior product. As a result, Nintendo garnered a reputation for quality that follows it to this day. In most gamers' minds, if Nintendo created it, it's a classic. This is significant because licensing is no longer optional for console manufacturers; it is essential. Nintendo, Sony and Microsoft all require developers to meet standards on everything from packaged games to downloadable budget titles.

However, some developers still manage to slip true garbage past these quality guidelines. The result is that a console manufacturer can inflate the number of titles available on its platform while neglecting to mention that a great percentage of them are unplayable, uninteresting or instantly forgettable. I do not want to point the finger at any particular corporation, but most gamers have a general idea of which company this describes. Far too often, gamers defend their console of choice by citing how many games are available for it. One console may have 500 titles, another 6,000. To fanboys, those sheer numbers signify that the console with 6,000 titles instantly provides greater variety to its gamers. To the more experienced, it is quality, not quantity, that sells the greater number of units.

This brings me to a question: was it the quantity of titles available for the Playstation 2 that allowed it to sell 100 million units, or was it the ratio of quality titles to terrible ones? If quality is what sold the consoles, what is the ratio of good games to bad ones on the platform? Any company given the epic task of selling a product wants to make sure that the product, and everything associated with it, is of the highest quality. Sony may well have done that, but opinions differ.

