I am still not clear what display you are using and how it is connected.
A DVI to VGA adapter merely passes through the pins in the DVI connector that carry the analog video signal. For the connection to count as "VGA", the display has to be taking its input on a VGA connection. For instance, Dell 24" displays let you choose from a variety of ports along the bottom, but you have to use the OSD to choose one. If I choose "VGA", put a DVI to VGA adapter on my card, and then run the VGA cable to the display, I am using VGA.
But if I put a DVI to VGA adapter on each end of the VGA cable, the only signal present will be VGA; there will be no DVI. Very few displays have ever been able to pull the VGA signal off a DVI plug, so in most cases this would not work.
If you use a DVI cable and the plug going into the display is DVI, then you are using the DVI part of the signal.
If you use a DP or HDMI to DVI adapter, it is only passing through the SINGLE LINK part of DVI, just like the DVI plug was passing through the VGA. They are separate signals nested inside the same plug. But using a DP or HDMI adapter only gives you single link, which means 1920x1200 is the biggest display it will run. Connecting it to a 30" display at 2560x1600 will give you either 1280x800 or a black screen.
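Here is a rough back-of-the-envelope sketch (in Python) of why single link tops out around 1920x1200: single-link DVI is capped at a 165 MHz pixel clock, and the clock a mode needs is total horizontal pixels x total vertical lines x refresh rate. The blanking totals below are approximate reduced-blanking figures, used purely for illustration, not exact monitor timings.

    # Rough sketch: why single-link DVI tops out around 1920x1200.
    # Single-link DVI is limited to a 165 MHz pixel clock. Required clock is
    # (total horizontal pixels) x (total vertical lines) x (refresh rate).
    # The totals below are approximate reduced-blanking figures (assumption,
    # for illustration only).

    SINGLE_LINK_LIMIT_MHZ = 165.0

    modes = {
        # name: (h_total, v_total, refresh_hz)
        "1920x1200 @ 60 Hz": (2080, 1235, 60),
        "2560x1600 @ 60 Hz": (2720, 1646, 60),
    }

    for name, (h_total, v_total, refresh) in modes.items():
        pixel_clock_mhz = h_total * v_total * refresh / 1e6
        verdict = ("fits on single link" if pixel_clock_mhz <= SINGLE_LINK_LIMIT_MHZ
                   else "needs dual link")
        print(f"{name}: ~{pixel_clock_mhz:.0f} MHz -> {verdict}")

Run it and 1920x1200 comes out around 154 MHz (under the limit), while 2560x1600 needs roughly 269 MHz, which is why it only works over dual-link DVI or a full DisplayPort connection.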
Paying attention to which adapter, connector, and cable you are using is important. It isn't like a water hose, where any fitting will work as long as water comes out the end.