Quote Originally Posted by MarineOne View Post
I will never "future proof" a system, simply because it's not financially reasonable to even try, so I agree with you there. I saw potential in the AGEIA physics accelerator card but never bought one; it was a great idea, but no one, not even game devs, bought into it.

However, there are some games (Stalker: Clear Sky, Crysis, Far Cry 2, CoD4, CoD:WaW) that play or played horribly on the video cards that were top of the line when the game was released, even with dual cards. Crysis is a great example of this; it plays great on my single GTX 285, but gameplay sucked on dual 8800 GTXs. Even the Stalker prequel plays badly on an nVidia 9800, though I haven't tested it on my rig yet, so I don't know how it will run.

I think the games coming out will continue to push hardware to evolve more than they did a few years ago. Back then, engines got reused for years: the original Ghost Recon was released in 2001, and Ubisoft used the very same engine for The Sum of All Fears tie-in game when the movie came out.
I definitely don't consider Crysis the benchmark everyone else did when it came out. It played like crap on my 8800, and my 9800 was only playable at a lower res. I think that had way more to do with the coding than the cards. And don't get me wrong, the graphics were good, but far from amazing to the degree that was hyped. The other games mentioned played on my 9800 without a problem, although I think the original Stalker was kind of choppy when maxed.