Quote Originally Posted by TonyTheTiger

Besides, as I said, video games are the result of advancing PC hardware, not the cause. And that's not necessarily a bad thing. As PC hardware improves and becomes more cost effective, video games take advantage of it. The scenario you're presenting is, if not completely untrue, certainly the exception rather than the rule. I think you're letting your subjectivity take hold of your arguments.
You do realize that video game systems were around for almost 10 years before most households owned their first home computers, right? In fact, until about 1977 you couldn't even buy a practical computer for your home, and even then it was outrageously priced. When I got my first Apple II in 1980, I was literally the only person in my elementary school class who had one, and that stayed true for about two years. Video games are the result of people looking for practical consumer applications for electronics technology; they go back to the earliest college mainframes and to the experiments Ralph Baer was doing with common television components. They are not, and never were, the result of advances in PC hardware.

You might want to have a conversation with Steve Wozniak or some of the other folks who took the home computer from a hobbyist tool to a mainstream consumer item and ask them what they think has driven computer specs forward. Here's a clue: it wasn't word processing or business applications. Even Andy Grove of Intel was interviewed a number of times and credited, or rather lamented, the fact that games had shortened the estimated life cycles of his various processor families by literally years. Video games have always been the driver of computer advances, and without them there would be almost no need for most computer owners to upgrade more than once a decade or so.