Display Connectors DL-DVI-I/SL-DVI-D
THAT'S your problem.
There are three types of DVI connection (well, five in theory, but bear with me).
DVI-D is purely digital, DVI-A is bog-standard VGA with a different pinout, and DVI-I is a mix of both. For DVI-I and DVI-D you also have single link and dual link, which is irrelevant to the question here; it only determines the maximum resolution.
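If you want a feel for where the single-link/dual-link cut-off actually sits, here is a rough Python sketch (mine, not from any spec tool) that estimates the pixel clock a mode needs using a simplified ~10% blanking overhead rather than the full CVT-RB timing formula, and compares it against the 165 MHz-per-link TMDS limit:

```python
# Rough check of whether a mode fits within single- or dual-link DVI.
# Uses a simplified ~10% blanking overhead instead of the exact CVT-RB
# timings, so treat the results as ballpark figures only.

SINGLE_LINK_MAX_MHZ = 165.0              # TMDS clock limit per DVI link
DUAL_LINK_MAX_MHZ = 2 * SINGLE_LINK_MAX_MHZ
BLANKING_OVERHEAD = 1.10                 # approximate reduced-blanking overhead

def required_pixel_clock_mhz(width, height, refresh_hz):
    """Approximate pixel clock needed for a given mode, in MHz."""
    return width * height * refresh_hz * BLANKING_OVERHEAD / 1e6

def dvi_link_needed(width, height, refresh_hz):
    clock = required_pixel_clock_mhz(width, height, refresh_hz)
    if clock <= SINGLE_LINK_MAX_MHZ:
        return "single link"
    if clock <= DUAL_LINK_MAX_MHZ:
        return "dual link"
    return "beyond dual-link DVI"

if __name__ == "__main__":
    for w, h, r in [(1920, 1080, 60), (1920, 1200, 60), (2560, 1600, 60)]:
        print(f"{w}x{h}@{r}Hz -> {required_pixel_clock_mhz(w, h, r):.0f} MHz, "
              f"{dvi_link_needed(w, h, r)}")
```

Running it shows the usual rule of thumb: anything up to about 1920x1200 at 60 Hz fits on a single link, while 2560x1600 needs dual link.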
DVI-I includes the analogue pins; DVI-D doesn't. As such, a passive DVI-to-VGA adaptor (which is what I mostly see; it's really a DVI-A-to-VGA converter that you usually connect to a DVI-I port) won't work on DVI-D, since you'd be connecting to pins that aren't there.
Annoyingly, DVI-D is supposed to have a different pinout precisely so that you can't plug in a passive adaptor and then find that it doesn't work. The Wikipedia article on DVI goes into all of this in depth.
In theory you should just use DVI for the second monitor if possible, or find an active converter that actually converts DVI-D to VGA, rather than one that simply passes the DVI-A pins through to a VGA connector.
Your graphics card should be able to drive both a VGA and a DVI monitor at the same time.
ATI have a (slightly rubbish) guide on the basic steps to get two monitors working: the ATI Multimonitor Guide. Basically, they show you how to run their setup wizard.
In the ATI Catalyst Control Centre you should see something similar to this (I've no idea how old this image is, but what you are looking for should have a similar name):

Basically, if your second monitor is being detected, you can enable it.
If, on the other hand, the monitor is not detected, check your connectors and cables to make sure they are all well seated. You could also try booting the machine with only the VGA monitor connected to see if it works at all.
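If you'd rather check detection from a script than from Catalyst Control Centre, here is a small Python sketch (Windows only; it uses ctypes with the standard Win32 EnumDisplayDevices API, and the helper name is mine) that lists the video outputs Windows knows about and whether each one is currently attached to the desktop:

```python
# Lists the display devices Windows reports and whether each one is
# attached to the desktop. Windows-only; uses the Win32 EnumDisplayDevices API.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001

def list_display_devices():
    user32 = ctypes.windll.user32
    devices = []
    i = 0
    while True:
        dd = DISPLAY_DEVICE()
        dd.cb = ctypes.sizeof(dd)
        # EnumDisplayDevicesW returns 0 once there are no more devices.
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
            break
        attached = bool(dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
        devices.append((dd.DeviceName, dd.DeviceString, attached))
        i += 1
    return devices

if __name__ == "__main__":
    for name, description, attached in list_display_devices():
        state = "attached to desktop" if attached else "not attached"
        print(f"{name}: {description} ({state})")
```

A second output that shows up here but isn't attached to the desktop is usually just a matter of enabling it; one that doesn't show up at all points back to cabling or the adapter.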
-=EDIT=-
Having re-read your question, I think I understand your problem a little better: you are using the VGA output on your graphics card, and the analogue output from the DVI port via a DVI-VGA adapter.
I'm still a little surprised that this did not work, but it is entirely possible that the graphics card only has one set of hardware (one RAMDAC) for converting the digital video signal to analogue VGA, and so will only drive one VGA monitor at a time. That would mean you can use either the VGA output or the DVI-VGA adapter, but not both.
I have a graphics card here that has two DVI ports, but only one of them will accept a DVI-VGA converter; the holes for the analogue pins are missing, so the adapter cannot be plugged in. I'd guess that card has the same limitation you are experiencing: only one analogue monitor output.
Another thing to check is whether your graphics card has a DisplayPort output. If so, you can get adapters to convert that to VGA. I recently had reason to buy a DisplayPort-to-VGA adapter, and it worked a charm for driving an old VGA monitor, and it was cheap at £5 all in.
It seems that active DVI-to-VGA converters are expensive (£100+), while DisplayPort-to-VGA converters are much cheaper at under £20.
Best Answer
This is because you are plugging your second monitor into the onboard graphics and not into your video card.
On the back of your computer, the port where you have your DVI cable connected is on the graphics card (green square); your second monitor is plugged into the motherboard (red square).
In order to connect your second monitor over VGA, you will need a second DVI port (most graphics cards have two; it should be next to the DVI port you are using) and a VGA-to-DVI adapter.
A second solution is to get a DMS-59 Y-splitter cable, which plugs into a single DMS-59 port on the card and breaks it out into two DVI connectors; a number of video cards provide DMS-59 for exactly this purpose.