Advice on 4K gaming / HDMI / DVI-D adapters

Fatpig11
Hey guys,

I've looked around a lot of threads and never seen this particular question answered, so I'm hoping to be enlightened. I'm looking to purchase a 4K monitor, and almost all the ones I've seen only take HDMI input. My GPU (Sapphire R9 290X Tri-X) only supports DVI-D. Getting an adapter is no issue, and I understand that to a point HDMI and DVI are the same in quality. But since HDMI is restricted to 30 Hz at 4K resolution, my question is: if I'm using an adapter that converts HDMI to DVI-D, will my screen be bottlenecked at 30 fps, or will I be able to use it as normal at 60? These monitors aren't cheap, so I want to make sure I'm not stuck playing games at painful frame rates.

Thanks in advance!
 
Solution
HDMI and DVI are not the same thing. Yes, they are both digital, but dual-link DVI supports much more bandwidth than early HDMI revisions did.

I think an HDMI-to-DVI adapter would still hit the same limitations that an HDMI-to-HDMI connection has, but I am not positive. Maybe someone else can confirm.
 


It's a digital signal; what bandwidth? HDMI is an adaptation of DVI; they are the same thing. It's like a 3.5 mm and a 1/4-inch phone connector.

HDMI is a newer digital audio/video interface developed and promoted by the consumer electronics industry. DVI and HDMI have the same electrical specifications for their TMDS and VESA/DDC links. However, HDMI and DVI differ in several key ways:

HDMI lacks VGA compatibility. The necessary analog contacts are absent in HDMI connectors.

DVI is limited to the RGB color range (0-255). HDMI supports RGB, but also YCbCr 4:4:4 and YCbCr 4:2:2. Those color spaces are widely used outside of computer graphics.

HDMI supports the transport of packets, needed for digital audio, in addition to digital video. An HDMI source differentiates between a legacy DVI display and an HDMI-capable display by reading the display's EDID block.
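
Just to illustrate that EDID point (my own sketch, not anything from the spec text above): an HDMI-capable sink advertises a Vendor-Specific Data Block carrying the HDMI Licensing OUI 00-0C-03 in its EDID's CEA-861 extension, while a legacy DVI sink does not. Assuming a Linux box that exposes EDID through sysfs (the connector path below is hypothetical), you can check for it yourself:

```python
# Minimal sketch: tell an HDMI sink from a legacy DVI one by scanning
# the EDID's CEA-861 extension for the HDMI Vendor-Specific Data Block.
# Assumptions: Linux, EDID exposed via sysfs; the path is hypothetical.

def sink_is_hdmi(edid: bytes) -> bool:
    # EDID comes in 128-byte blocks; extensions follow the base block.
    for off in range(128, len(edid), 128):
        block = edid[off:off + 128]
        if len(block) < 4 or block[0] != 0x02:      # 0x02 = CEA-861 extension
            continue
        dtd_start = block[2]                        # detailed timings begin here
        i = 4                                       # data blocks start at byte 4
        while i < dtd_start:
            tag, length = block[i] >> 5, block[i] & 0x1F
            if tag == 3 and length >= 3 and i + 3 < len(block):
                # Tag 3 = vendor-specific block; OUI is little-endian.
                oui = block[i + 1] | (block[i + 2] << 8) | (block[i + 3] << 16)
                if oui == 0x000C03:                 # HDMI Licensing, LLC
                    return True
            i += 1 + length
    return False

with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:  # hypothetical path
    print("HDMI-capable sink:", sink_is_hdmi(f.read()))
```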
 


Do some more research. HDMI has limitations that DVI does not. For example, you can't get anything over 60 Hz with HDMI. Also, HDMI can only transmit 4K at 30 Hz (until we get HDMI 2.0).

When we start getting HDMI 2.0 in our video cards, there will probably be no difference.
 


What are you talking about?
It's a digital signal; there's no limitation.
 


Wrong. Go do some research. HDMI will only do 4K at 30 Hz until we get HDMI 2.0.
 
What’s new in the HDMI 1.4 specification?
...
4K Resolution Support

The new specification enables HDMI devices to support extremely high HD resolutions, effectively four times the resolution of a 1080p device. Support for 4K allows the HDMI interface to transmit digital content at the same resolution as the state-of-the-art Digital Cinema systems used in many movie theaters.
...

Will any of the new HDMI 1.4 features require a new cable?
...
The HDMI Ethernet Channel feature will require a new cable that supports this functionality, either a Standard HDMI Cable with Ethernet or a High Speed HDMI Cable with Ethernet, depending on the maximum resolution to be supported. The Automotive Connection System will also employ a new class of cable, the Standard Automotive HDMI cable, which is designed specifically for automotive use. All of the other new HDMI 1.4 features will be compatible with the existing categories of cables.
...

http://www.hdmi.org/manufacturer/hdmi_1_4/hdmi_1_4_faq.aspx#1
 
It's a digital signal; it makes no difference.

"Impressively, all this goodness won't require new cabling. The connector is unchanged, and according to this page on the official HDMI site, 'Current High Speed cables (Category 2 cables) are capable of carrying the increased bandwidth.'"

It makes no difference how you connect the video card to the monitor.
 


Connecting the video card to the monitor via a DVI-to-HDMI adapter makes no difference.
It will not bottleneck the FPS at 30.

Do you dispute that?
 


As I said in my first post, I'm not sure whether an HDMI-to-DVI adapter would still hit the same limitations that an HDMI-to-HDMI connection has.
 
Yeah, Alec, you are wrong. For 4K at higher refresh rates you generally have to use two HDMI connectors, each driving half of the display, because a single HDMI link doesn't supply enough bandwidth to carry full 4K video at those rates.

Right in the HDMI FAQ you can read:

HDMI said:
What does 4K refer to?

4K is a term used to describe displays with resolutions that are essentially four times that of a 1080p device – or roughly 4,000 lines wide by 2,000 lines high. The HDMI 1.4 specification supports multiple 4K formats:

3840 pixels wide by 2160 pixels high @ 24Hz | 25Hz | 30Hz
4096 pixels wide by 2160 pixels high @ 24Hz
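
To put rough numbers on why those are the only 4K formats HDMI 1.4 lists (my own back-of-the-envelope math, not from the FAQ; the ~7% blanking overhead is an approximation and the link limits are the nominal max TMDS clocks):

```python
# Back-of-the-envelope TMDS pixel-clock math (approximate blanking;
# real modes like CVT-RB differ slightly). A link can carry a mode
# only if its max TMDS clock meets the required pixel clock.

LIMITS_MHZ = {
    "single-link DVI": 165,
    "dual-link DVI": 330,
    "HDMI 1.4 (High Speed)": 340,
    "HDMI 2.0": 600,
}

def pixel_clock_mhz(h, v, hz, blanking=1.07):
    # Active pixels per frame, times refresh, plus ~7% for blanking.
    return h * v * hz * blanking / 1e6

for hz in (30, 60):
    need = pixel_clock_mhz(3840, 2160, hz)
    print(f"3840x2160 @ {hz} Hz needs ~{need:.0f} MHz:")
    for link, cap in LIMITS_MHZ.items():
        verdict = "OK" if cap >= need else "not enough"
        print(f"  {link:<22} {cap:>3} MHz -> {verdict}")
```

By that math a single HDMI 1.4 link (and dual-link DVI, for that matter) tops out below what 4K at 60 Hz needs, which is why DisplayPort comes up later in the thread.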
 


The limitation would be on the refresh rate of the monitor: if it only supports 30 Hz at 4K, it will only refresh at 30 Hz.
But that won't cap your in-game FPS unless vsync is enabled (see the sketch below).

If the monitor lists support for 60 Hz at 4K, then the physical cable or conversion wouldn't be any issue.
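
(A toy model of that vsync point, nothing authoritative: with double-buffered vsync the displayed rate snaps down to refresh, refresh/2, refresh/3, ... of whatever the GPU renders, while with vsync off the render rate simply passes through.)

```python
import math

# Toy model of double-buffered vsync: a finished frame waits for the
# next refresh it can make, so the displayed rate is refresh/n for
# the smallest integer n with refresh/n <= render rate.

def displayed_fps(render_fps, refresh_hz, vsync=True):
    if not vsync:
        return render_fps  # tearing possible, but no cap
    return refresh_hz / math.ceil(refresh_hz / render_fps)

for fps in (25, 45, 90):
    print(f"{fps} fps rendered -> {displayed_fps(fps, 30):g} fps shown on a 30 Hz panel")
```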
 


DVI does the same thing

https://www.youtube.com/watch?v=qu3LxxCStKs

The cable itself doesn't make any difference, and neither does the conversion; they carry the same data. The only issue is the limitation of the monitor itself: if it only supports 30 Hz, then you will only get 30 Hz.
 


Doesn't that card have DisplayPort? Use that.
 


Let me guess... you also think DisplayPort sends the exact same amount of data because it's digital as well?

Just stop. DVI and HDMI do not have the same bandwidth.
 


Also, you might want to keep in mind that gaming at 4K means really terrible frame rates anyway, even with a 290X...

Look at things like:
http://www.eteknix.com/4k-gaming-showdown-amd-r9-290x-r9-280x-vs-nvidia-gtx-titan-gtx-780/9/

and:
http://www.tweaktown.com/tweakipedia/32/sapphire-radeon-r9-290x-tri-x-benchmarked-at-4k/index.html



 