My desktop has a basic graphics card, a 1GB ATI Radeon HD 5450. It has served me well over the years, but I'm now looking at replacing it because it can't run games as well as it should.
Take Battlefield 2142 and Crysis, two games released within a year of each other (2006 and 2007). My desktop (quad-core 2.8GHz, 4GB RAM, 1GB graphics) runs the first at decent frame rates (40+ fps), if maybe not on full graphics, and it easily exceeds the stated requirements for Battlefield 2142 (1.7GHz CPU, 512MB RAM, 128MB graphics). It also meets the requirements for Crysis (2.8GHz CPU, 1.5GB RAM, 256MB graphics), and other games like Just Cause 2 and Battlefield: Bad Company 2 "should" work too, yet they are so slow and juddery that I barely scrape 10 fps out of any of them. The intro scenes for both Bad Company 2 and Just Cause 2 play slowly with the audio and video out of sync.
I realize there's a lot I don't know about graphics cards, but I also realize now that it's not just a case of having a 1GB or 2GB card. What are the things that actually "matter" in a graphics card: the core clock, the memory, the memory clock?
Best Answer
There are a whole bunch of factors that determine how good a GPU is. Let's take a look at some of them:
You'll notice that I haven't mentioned the amount of memory anywhere above. That's because, except in extreme cases, memory size simply doesn't matter. Unlike system RAM, where more memory often means faster performance, a GPU's memory is used as a frame buffer and for texture storage. Putting 4GB on a low-end GPU is therefore worthless, because the GPU itself can't perform well enough to process those textures.
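To put a rough number on that: what actually constrains a card like the HD 5450 is how fast the GPU can move data in and out of that memory, not how much of it there is. Here's a minimal sketch (Python, with approximate clock and bus-width figures used purely for illustration, not official specs) comparing the theoretical memory bandwidth of a 64-bit DDR3 card like yours to a typical 256-bit GDDR5 card:

```python
# Rough illustration: why memory *bandwidth* (not size) limits a low-end card.
# The transfer-rate and bus-width figures below are approximate.

def bandwidth_gbs(effective_mts, bus_width_bits):
    """Theoretical memory bandwidth in GB/s:
    effective transfer rate (MT/s) * bus width (bytes) / 1000."""
    return effective_mts * (bus_width_bits / 8) / 1000

cards = {
    "Radeon HD 5450 (1GB DDR3, 64-bit)":        bandwidth_gbs(1600, 64),   # ~12.8 GB/s
    "Typical mid-range card (GDDR5, 256-bit)":  bandwidth_gbs(4000, 256),  # ~128 GB/s
}

for name, bw in cards.items():
    print(f"{name}: ~{bw:.1f} GB/s")
```

Both cards could carry 1GB of memory, but the one with roughly ten times the bandwidth (and a much faster GPU core behind it) is the one that actually delivers playable frame rates.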
Other software/driver switches which cause performance drops/improvements: