Showing results for tags 'DVI'.
Found 2 results
So I recently got a new monitor (BenQ EW2440L, 24", 1920x1080) and it came with a VGA cable in the box. The monitor itself supports D-sub and HDMI input, so I was wondering whether it's worth switching from VGA to DVI/HDMI. From what I've read, a DVI or HDMI cable should provide better quality than VGA, but will the difference be noticeable? Also, if I do make the switch from analog to digital, would it be better to get an HDMI-to-DVI cable or a regular HDMI cable? Any input is appreciated
Hi! What I'm trying to do: set up 5 separate monitors on two XFX Radeon HD 6950 cards.

Problem: I'm only able to activate 4 of the 5 monitors (the DVI ports). The last one, connected with an active mini-DP to DVI-D single-link adapter, doesn't work. I always receive the same error message telling me that the link didn't work.

What I tried:
1- With and without CrossFire: didn't work.
2- Checked that all updates were installed: everything looks good.
3- Set up only three monitors on one graphics card (2 DVI + 1 adapter): didn't work.
4- Tried many combinations (2 DVI/1 HDMI, 1 DVI/1 HDMI/1 adapter), on both graphics cards: didn't work.
5- Tried those combinations again with 4 different monitors: didn't work.
6- Tried configuring Eyefinity, even though I don't want it (I want 5 separate monitors): didn't work.
7- Tried a passive mini-DP to DVI adapter. The mini-DP output worked, but I wasn't able to activate more than one other monitor.

OS: Windows 7
The adapter: http://www.canadacomputers.com/product_info.php?cPath=9...)
Monitors: 1x BENQ BL2400, 2x BENQ RL2455HM, 1x ACER G235HL and 1x LG FLATRON W2353V.

Any clues/solutions? Thanks