I have two VGA connectors open on my monitor. Can I keep one DVI cable plugged into my computer (as I only have one slot) and then plug the VGA cable into both of them? Will Windows 7 detect two monitors? (My computer doesn't have any HDMI ports.)
Daisy chain two monitors to use for Windows 7
multiple-monitors vga
Related Solutions
According to the user manual for that card (downloadable via the product page on MSI.com), the diagram of the card on page 2 states under an asterisk:
The DVI-I port and D-Sub port of this card do not support dual monitor.
D-Sub here refers to the VGA connector (the 15-pin VGA connector is part of the D-Sub connector family).
So it seems you may be out of luck trying to get dual monitors working with this card: it simply does not support them.
By the way, CCC might not let you manage the display settings for this card because it's a very old card, and recent versions of CCC may not be compatible with its drivers. According to the product page, it's powered by the "ATI® Radeon® X300SE GPU". That card uses the RV370 GPU, which, according to Wikipedia, was released in 2005. That's 8 years ago, which is ancient in computing terms.
From "I attached one adapter to the monitor, and the other one to the card's DVI-D port, but it doesn't work. My Windows 7 doesn't recognise it. The NVIDIA Control Panel neither. The screen stays blank too.", I take it you tried this on the second monitor, i.e. the Dell 1801FP. And since "When I unplug the older Asus, and simply use the VGA cables and ports, it works, but I need both monitors.", your VGA cable is clearly working.

Given that, first try that known-good VGA cable on the VGA port of the second screen. If that works, your DVI-D adapters may be faulty (I assume the Dell monitor is newer, so its DVI port should not be at fault).

Regarding "My Windows 7 doesn't recognise it. The NVIDIA Control Panel neither": you probably also have integrated graphics (most computers nowadays ship with an integrated GPU built in), so check whether you need to change the monitor's input after plugging into the DVI ports, and set the default graphics adapter in the BIOS to the external card, then see if it works.

If that still doesn't help, verify that the DVI ports on the graphics card are fine and that the connectors are seated correctly.
Also, if Windows cannot recognise it, that may be because you haven't installed the card's graphics driver.
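As a quick sanity check on whether Windows actually detects a second display, you can query the monitor count from a script. This is a minimal sketch (not from the original answer) using Python's ctypes and the Win32 GetSystemMetrics call with the SM_CMONITORS index; it only does anything useful on Windows.

```python
import ctypes
import sys

# GetSystemMetrics index for the number of display monitors on the desktop
SM_CMONITORS = 80

def count_monitors():
    """Return the number of monitors Windows currently detects,
    or None when not running on Windows."""
    if sys.platform != "win32":
        return None
    return ctypes.windll.user32.GetSystemMetrics(SM_CMONITORS)

if __name__ == "__main__":
    n = count_monitors()
    if n is None:
        print("Not running on Windows; cannot query the monitor count.")
    else:
        print(f"Windows detects {n} monitor(s).")
```

If this reports only one monitor while both are plugged in, the second display is not being detected at the driver level, which points at the cable, adapter, or driver rather than at the Windows display settings.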
Best Answer
VGA does not allow any kind of daisy chaining at all, so doing what you are proposing is not possible as described.
It is more likely that the dual VGA connectors are meant to be used as two separate inputs. In that case you can connect both to the same system and switch between inputs on the monitor to view the different parts of the extended desktop Windows provides (like a normal dual-monitor setup, but with only one physical monitor).
VGA and DVI signalling are not directly compatible: VGA is analog, whereas DVI-D (like HDMI) is digital, so going from a DVI-D output to a VGA monitor requires an active converter. (A DVI-I port also carries the analog signal, which is why a cheap passive adapter works there.) DVI and HDMI are, however, electrically compatible but offer different feature sets.
Remember that VGA was introduced with the IBM PS/2, which was released in early 1987. For comparison, at that time MS-DOS 3.3 was current, Windows 2.0 wouldn't be released for another half a year, and the brand-new, text-mode OS/2 1.0 came out at the end of that year to take advantage of the powerful hardware; the most powerful model of the initial line-up (the PS/2 Model 80, IBM model number 8580) sported an 80386 DX CPU.