4096 x 2160 vs. 3840 x 2160 for HDTV

rasmasyean

Distinguished
Mar 15, 2008
264
1
18,795
OK, I'm in the market for a 4K TV to go with my existing HTPC. When shopping for video cards I noticed this "mismatched standard" for 4K. It looks like most video cards list the 4096-wide version as their max resolution. Should I get one of these? Will there be a problem setting the resolution to match the TV at 3840? Also, what's with this? Is this sort of like a repeat of the 1080p vs. 720p history?

Thanks!
 

viewtyjoe

Reputable
Jul 28, 2014
1,132
0
5,960
There was an article fairly recently about UHD vs. 4K and so on. As long as your GPU supports the resolution of your monitor/TV, all will be well. The main problem is that the cinema 4K standard is defined at a wider aspect ratio than current Full HD (1920x1080), so some hardware supports the full 4096x2160 resolution, but most monitors and TVs use 3840x2160, which is Full HD doubled in each dimension.
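For what it's worth, a quick arithmetic check of the two shapes (plain Python, using only the resolutions mentioned above):

```python
# Compare the two "4K" resolutions and Full HD.
resolutions = {
    "DCI 4K":  (4096, 2160),   # the wider, cinema-oriented standard
    "UHD":     (3840, 2160),   # what consumer 4K TVs and monitors actually use
    "Full HD": (1920, 1080),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w}x{h}, aspect ratio {w / h:.3f}")

# DCI 4K comes out to ~1.896 (wider than 16:9), while UHD and Full HD
# are both ~1.778, i.e. exactly 16:9 -- UHD is Full HD doubled in each dimension.
```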
 
4K TVs and monitors use 3840x2160. A video card with 4GB or more of video RAM can render the image at a higher resolution in its memory and then downsample it to 3840x2160, producing an image with fewer or no jaggies. 4K is already such a high resolution that it is almost free of jaggies anyway.
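If you want to see what that downsampling step looks like in isolation, here is a minimal sketch using Pillow. The 4096-wide source frame and the file names are just assumptions for the example; this shows the general idea, not what any particular card does internally:

```python
# Supersample-then-downscale sketch using Pillow (pip install Pillow).
from PIL import Image

# Assume a frame was already rendered at a higher resolution than the display.
src = Image.open("render_4096x2160.png")        # hypothetical oversized render
dst = src.resize((3840, 2160), Image.LANCZOS)   # high-quality downsample to the TV's resolution
dst.save("frame_3840x2160.png")

# Each output pixel is a weighted blend of several source pixels,
# which is what smooths out the jaggies.
```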
 

rasmasyean

Distinguished
Mar 15, 2008
264
1
18,795


I thought all video cards do "p". Is there a difference between an A x B resolution and "Bp" (or "Bi")?
So should I get a video card with an HDMI port in addition to DVI? Is that an indicator that it's compatible with "UHD"?
Also, will this work with Windows Media Center?
 

rasmasyean

Distinguished
Mar 15, 2008
264
1
18,795
Oh wait. It looks like DVI is 2560x1600 on most of the video cards I checked.
I suppose that's the max that DVI ports can handle.
It looks like the higher resolutions are only available over HDMI and DisplayPort.
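A rough bandwidth comparison lines up with that. The per-mode numbers below ignore blanking intervals (so real requirements are somewhat higher), and the link capacities are the nominal payload rates of each standard, so treat this as a back-of-the-envelope sketch:

```python
# Back-of-the-envelope pixel data rates vs. link payload capacity (Gbit/s).
def needed_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate, ignoring blanking intervals."""
    return width * height * fps * bits_per_pixel / 1e9

modes = {
    "2560x1600 @ 60 Hz": needed_gbps(2560, 1600, 60),   # ~5.9
    "3840x2160 @ 30 Hz": needed_gbps(3840, 2160, 30),   # ~6.0
    "3840x2160 @ 60 Hz": needed_gbps(3840, 2160, 60),   # ~11.9
}

links = {
    "dual-link DVI":   7.92,
    "HDMI 1.4":        8.16,
    "HDMI 2.0":        14.4,
    "DisplayPort 1.2": 17.28,
}

for mode, need in modes.items():
    fits = [name for name, cap in links.items() if cap >= need]
    print(f"{mode}: ~{need:.1f} Gbit/s -> fits on: {', '.join(fits)}")
```

So 3840x2160 at 60 Hz is the mode that outgrows DVI (and HDMI 1.4); it needs HDMI 2.0 or DisplayPort 1.2.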
 


No, that was Apple creating more proprietary BS. Crapple will never make standard ports/devices because they suck and have lost all ingenuity.
 

rasmasyean

Distinguished
Mar 15, 2008
264
1
18,795


I got the TV and had one of these cards:
SAPPHIRE DUAL-X Radeon R9 280 100373L 3GB 384-Bit GDDR5 PCI Express 3.0

It works, but it IS a bit choppy and makes the WMC menu "draggy". YouTube is noticeably choppy as well, but not as much. It's smoother at the 30p setting (vs. 60p). And for some reason, my TV says it's receiving 30p while AMD Catalyst says it's outputting 60p. Is this an HDMI 2.0 cable issue? I just grabbed an existing short cable.

So I'm guessing it's not that the GPU isn't powerful enough, but that there's just too much data to move back and forth through the 3GB of memory? Does that also have anything to do with the 30p on the TV end?
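If it helps to cross-check, here is a small sketch using the pywin32 package that just reads back what Windows itself reports for the current mode, and whether the driver even exposes a 3840x2160 @ 60 Hz mode. It is read-only and changes nothing:

```python
# Read the current display mode as Windows reports it (requires pywin32: pip install pywin32).
import win32api
import win32con

cur = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print(f"Current mode: {cur.PelsWidth}x{cur.PelsHeight} @ {cur.DisplayFrequency} Hz, "
      f"{cur.BitsPerPel}-bit color")

# Enumerate every mode the driver exposes and list the 3840x2160 refresh rates on offer.
i = 0
uhd_rates = set()
while True:
    try:
        m = win32api.EnumDisplaySettings(None, i)
    except win32api.error:
        break
    if (m.PelsWidth, m.PelsHeight) == (3840, 2160):
        uhd_rates.add(m.DisplayFrequency)
    i += 1

print("3840x2160 refresh rates offered by the driver:", sorted(uhd_rates) or "none")
```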
 
Thunderbolt is Intel. Apple was first to use it. Now they are up to the third version of it. It has never had affordable cables, so it is a very niche technology. With the Z170 motherboards, Intel has merged dual 4K DP (DisplayPort) 1.2 support, USB 3.1 10Gb and Thunderbolt 3 into a single USB-C connector.

Computer ports with Thunderbolt 3 provide 40Gbps Thunderbolt – double the speed of the previous generation, USB 3.1 10Gbps, and DisplayPort 1.2. For the first time, one computer port connects to Thunderbolt devices, every display, and billions of USB devices. In Thunderbolt mode, a single cable now provides four times the data and twice the video bandwidth of any other cable, while supplying power. It’s unrivaled for new uses, such as 4K video, single-cable docks with charging, external graphics, and built-in 10 GbE networking. Simply put, Thunderbolt 3 delivers the best USB-C.

Thunderbolt™ 3 – The USB-C That Does It All

“Thunderbolt™ 3 is computer port nirvana – delivering two 4K displays, fast data, and quick notebook charging”, said Navin Shenoy, vice president in Client Computing Group and general manager of Mobility Client Platforms at Intel Corporation. “It fulfills the promise of USB-C for single-cable docking and so much more. OEMs and device developers are going to love it.”

Users have long wanted desktop-level performance from a mobile computer. Thunderbolt was developed to simultaneously support the fastest data and most video bandwidth available on a single cable, while also supplying power. Then recently the USB group introduced the USB-C connector, which is small, reversible, fast, supplies power, and allows other I/O in addition to USB to run on it, maximizing its potential. So in the biggest advancement since its inception, Thunderbolt 3 brings Thunderbolt to USB-C at 40Gbps, fulfilling its promise, creating one compact port that does it all.

What that article does not tell you is that Thunderbolt 3 adds something like $40 to the cost of a motherboard. So it is likely to be found only on high-end motherboards.

And even when you have a board with it, the 3-meter copper cable limit and still-expensive optical cables add to the cost of the thing.
 
Solution