Any performance difference using HDMI vs. DVI adapter?

linkedpixel
Honored Guest
Hi all,

My setup includes a GeForce GTX 680 and two 144 Hz Asus VG248 monitors, each running via dual-link DVI-D cables to the corresponding ports on the video card. This lets me reach the 144 Hz refresh rate that doesn't seem possible over HDMI, which restricts the monitor to 60 Hz at 1080p.

I've yet to hook the DK2 up (having just upgraded my main machine to Windows 8.1), but this has got me wondering. Will there be any performance problems if I connect the HDMI cable directly to the video card without using the included HDMI-to-DVI adapter? Can the HDMI port built into the DK2 actually support 75 Hz at the DK2's resolution? Has anyone tried both ways to see if there are any performance discrepancies between native HDMI and the adapter? Thanks.

datenwolf
Honored Guest
"linkedpixel" wrote:
Will there be any performance problems if I connect the HDMI directly to video card without using the included HDMI to DVI adapter?


HDMI is essentially single-link DVI with a different connector, so anything that applies to single-link DVI also applies to HDMI. The adapter is just an HDMI connector wired back to back to a DVI connector with no active components in between. The maximum refresh rate is a function of the number of pixels to be transmitted and the capabilities of the interfacing electronics on either end. Which brings us to…

"linkedpixel" wrote:
Can the HDMI built into DK2 actually support 75hz at DK2 resolution?

Obviously yes. In fact, HDMI 1.3 is specified with clock rates that easily support Full HD @ 75 Hz. No problem there.
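To put some back-of-the-envelope numbers on the bandwidth argument: the 165 MHz single-link and 340 MHz HDMI 1.3 TMDS clock ceilings below are from the respective specs, but the blanking overheads are assumptions for illustration, not the DK2's actual timings.

```python
# Rough TMDS pixel-clock estimate for 1920x1080 @ 75 Hz.
# Link limits are per spec; the blanking overhead is an assumed
# fraction, not the DK2's real video timings.

SINGLE_LINK_DVI_MHZ = 165.0  # single-link DVI (and HDMI 1.0-1.2) max TMDS clock
HDMI_1_3_MHZ = 340.0         # HDMI 1.3 max TMDS clock

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead):
    """Active pixels per second plus an assumed blanking fraction, in MHz."""
    return width * height * refresh_hz * (1.0 + blanking_overhead) / 1e6

# Conventional CVT-style blanking (~12%) vs aggressive reduced blanking (~5%):
for overhead in (0.12, 0.05):
    clk = pixel_clock_mhz(1920, 1080, 75, overhead)
    print(f"{overhead:.0%} blanking -> ~{clk:.0f} MHz "
          f"(single-link limit {SINGLE_LINK_DVI_MHZ:.0f} MHz, "
          f"HDMI 1.3 limit {HDMI_1_3_MHZ:.0f} MHz)")
```

Either way the required clock sits far below the HDMI 1.3 ceiling, while the conventional-blanking figure lands right around the single-link limit, which is why tight, reduced-blanking timings matter when the signal goes through the passive adapter.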

The real question is: is your graphics card able to push the pixels at that rate over a DVI output? I'd put my money on yes, as long as it's not older than about 8 years.