Posted: Howdy, I got a nice 19" LCD monitor for my birthday. It comes with DVI and VGA connectors. The instructions are very vague and show both being connected at the same time. My video card has both ports, so I connected both. The monitor would show the startup screen and then go to no signal. I disconnected the DVI from the computer and it works fine with just the VGA. My question is: am I not using my setup to its best capacity? Should the DVI be my connection for better resolution? I think I would need to change some settings to use the DVI, but I don't want to fix something that's not really broken.
For the same reason we went from analog cassettes to digital DVDs, analog TV to digital TV, etc., DVI is far superior and will give you a noticeably better picture than VGA. There will be settings for it in your video card setup (Control Panel -> Display Properties) if your card has both outputs.
Posted: Exactly. With VGA the signal is converted from digital to analog and then back to digital again, and every time you do something crazy like that you lose quality. That's why digital camcorders these days copy footage off over FireWire (digital), not phono connectors (analog).
This all said, depending on the length and quality of the VGA cable, you may not notice a difference on LCDs 17" and under.
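For anyone curious, here is a toy Python sketch of that point (the noise level is an invented illustration value, not a measured property of any real cable): send a "digital" framebuffer through a simulated DAC, add the noise an analog cable run picks up, and re-digitize it the way the LCD's ADC would. Some values come back changed, and that error is baked in for good.

```python
# Toy model of a digital -> analog -> digital round trip.
# The noise magnitude (sigma = 2.0) is invented purely for illustration.
import random

random.seed(0)

original = [random.randrange(256) for _ in range(1000)]      # digital pixel values
analog = [v + random.gauss(0, 2.0) for v in original]        # DAC output plus cable noise
redigitized = [min(255, max(0, round(v))) for v in analog]   # LCD's ADC samples it back

changed = sum(1 for a, b in zip(original, redigitized) if a != b)
print(f"{changed} of {len(original)} values changed in the round trip")
```

A DVI link skips both conversions, which is why none of this applies to it.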
Posted: OK, I'm still fiddling with it. The hard part is that I have it plugged in as VGA so I can see what I'm doing. I also have the DVI plugged in. When I switch it to digital (through the video card in Display Properties) it displays "no signal", even if I unplug the analog part. When I plug it back in the screen pops back up and it defaults back to analog VGA. We tried having it self-install the correct monitor drivers (no software came with the monitor) and it seemed to work, but I still cannot unplug the VGA and have a signal. Also, the video card tab in Display Properties is gone. I thought I had it until I realized I had selected a dual monitor option because it's plugged in twice. My future wife keeps telling me to leave it alone since it's working. So I can't tell if I'm digital or not, but it's only one monitor now.
In case it helps, my video card is an ASUS with 128 MB of memory and VGA, DVI, and the S-Video TV-out thingy. The computer has a gig of RAM and about a 1.3 GHz processor.
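As an aside, the adapter and output list that the Display Properties dialog draws from can also be read programmatically. Below is a minimal, Windows-only Python sketch using the standard-library ctypes binding to the Win32 EnumDisplayDevices call; it simply prints each display device and whether Windows currently considers it attached to the desktop, which can help confirm whether the DVI output is active at all.

```python
# Windows-only sketch: list display adapters/outputs via Win32 EnumDisplayDevicesW.
import ctypes
from ctypes import wintypes

class DISPLAY_DEVICEW(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
DISPLAY_DEVICE_PRIMARY_DEVICE = 0x00000004

user32 = ctypes.windll.user32

i = 0
dev = DISPLAY_DEVICEW()
dev.cb = ctypes.sizeof(dev)
while user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
    attached = bool(dev.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
    primary = bool(dev.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
    print(f"{dev.DeviceName}: {dev.DeviceString} (attached={attached}, primary={primary})")
    i += 1
    dev = DISPLAY_DEVICEW()
    dev.cb = ctypes.sizeof(dev)
```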
Posted: 1 gig of RAM but a 1.3 GHz processor? That's just silly. Anyway, Windows XP's craziness at configuring TV-out and second monitors is part of the fun. If you've got a second VGA monitor lying around, it really helps to plug them both in. There's a button labelled "Identify" (or something like it) in Display Properties, which shows a big 1 on the first monitor and a big 2 on the second. It isn't that straightforward, but there are probably tutorials if you Google for them. Failing that, check your video card's or monitor's website/forums.
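For the record, what that "Identify" button reports can also be pulled out with a short script. Here is a Windows-only Python sketch (standard-library ctypes again, built on the Win32 EnumDisplayMonitors and GetMonitorInfo calls) that numbers the desktop monitors and prints each one's screen rectangle, so you can see how many monitors Windows really thinks it has.

```python
# Windows-only sketch: number the desktop monitors like the "Identify" button.
import ctypes
from ctypes import wintypes

user32 = ctypes.windll.user32

class MONITORINFO(ctypes.Structure):
    _fields_ = [
        ("cbSize", wintypes.DWORD),
        ("rcMonitor", wintypes.RECT),   # full monitor rectangle in screen coords
        ("rcWork", wintypes.RECT),      # work area (minus taskbar)
        ("dwFlags", wintypes.DWORD),
    ]

MONITORINFOF_PRIMARY = 0x1

MonitorEnumProc = ctypes.WINFUNCTYPE(
    wintypes.BOOL, wintypes.HANDLE, wintypes.HDC,
    ctypes.POINTER(wintypes.RECT), wintypes.LPARAM)

monitors = []

def _collect(hmon, hdc, lprect, lparam):
    info = MONITORINFO()
    info.cbSize = ctypes.sizeof(MONITORINFO)
    user32.GetMonitorInfoW(hmon, ctypes.byref(info))
    monitors.append(info)
    return True  # keep enumerating

user32.EnumDisplayMonitors(None, None, MonitorEnumProc(_collect), 0)

for n, info in enumerate(monitors, start=1):
    r = info.rcMonitor
    tag = " (primary)" if info.dwFlags & MONITORINFOF_PRIMARY else ""
    print(f"Monitor {n}: ({r.left},{r.top}) to ({r.right},{r.bottom}){tag}")
```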