I've got a problem. I've got the following setup:
- NVIDIA GeForce GTX 560 Ti with 2× DVI (dual-link) and 1× mini-HDMI
- Asus VW226 (22", 1680×1050)
- Acer V233H A (23", 1920×1080)
I use the Asus as my primary display since it has a lower pixel density (everything is larger and thus easier to read). Previously I had a 19" LCD as my secondary display, connected through a VGA cable (it had no DVI input).
The new screen (well, it's second-hand) has both VGA and DVI inputs. Because I didn't have a spare DVI cable, I connected it through VGA, and after some configuration (it didn't detect the resolution correctly and only offered 4:3 resolutions) it works fine.
I've now got a new DVI cable and want to connect the monitor through DVI, but that doesn't work: no second monitor is detected in either the Windows or the NVIDIA configuration screens. I've also tried the "rigorous detection" option in the NVIDIA Control Panel.
I've swapped the ports, and then the Acer worked fine but the Asus didn't: same symptoms. So both displays work over DVI on the first port, but only over VGA on the second.
As I don't have a mini-HDMI -> DVI converter I can't try that port, but does anyone have an idea how I could fix this?
"BOTH monitors work on DVI when coupled to the first port, neither when to the second. The second port does pass VGA through fine"
When you use a DVI->VGA adapter, only a fraction of the pins on the DVI port are actually in use: the analog (VGA) signals travel over the "C" pins, not over the digital pins.
If you've tried two different monitors and two different DVI cables, and DVI still won't work on that one port, then the digital pins on that port are probably damaged. The analog/VGA path doesn't use those pins, which is why VGA still passes through fine.
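For reference, the analog path through a DVI-I connector touches only a handful of pins. The sketch below (in Python, purely as an illustration) lists the standard DVI-I pin assignments a passive DVI->VGA adapter relies on:

```python
# Standard DVI-I pin assignments for the analog (VGA) path.
# A passive DVI->VGA adapter uses only these pins; the digital
# (TMDS) pins are untouched, which is why a port with damaged
# digital pins can still drive a monitor over VGA.
ANALOG_PINS = {
    "C1": "analog red",
    "C2": "analog green",
    "C3": "analog blue",
    "C4": "analog horizontal sync",
    "C5": "analog ground (R/G/B return)",
    "8":  "analog vertical sync",
}

# DDC (the channel used for monitor detection / EDID) is shared
# between the analog and digital modes:
DDC_PINS = {"6": "DDC clock", "7": "DDC data"}

if __name__ == "__main__":
    print(f"{len(ANALOG_PINS)} analog pins + {len(DDC_PINS)} DDC pins in use over VGA")
```

So if the monitor is detected and works over VGA but not over DVI on the same port, the shared DDC pins are fine and the fault is almost certainly in the digital-only pins.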
Sounds like you need a new video card.