AMD GPUs In 2016: HDR, FreeSync Over HDMI And New Standards

I do not understand how you are relating nits (brightness) to the Rec. 2020 spec. I think I have a good understanding of color spaces and such, and I have NEVER used a nit value when talking about a color space. The only time I look at nits is the max brightness of the white point, to see how well a monitor would do outdoors. What am I missing here?
 
For many years I have said that 24/32-bit (i.e., 8 bits per channel) isn't enough for our eyes. It is an illusion created by display makers to make common folks believe 24/32-bit color is enough.

Even BT.2020, which uses 10-bit color, is still a far cry from what the human eye can see (the CIE 1931 xy diagram).

The ultimate holy grail of displays would be 8K + 48/64-bit color + 120 Hz.
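To put rough numbers on the bit-depth claims above, here's a quick back-of-the-envelope sketch in Python (purely illustrative; it only counts encodable colors, which says nothing about which of them a given panel or gamut can actually show):

```python
# Rough count of distinct colors at various per-channel bit depths.
# Note: "24/32-bit" color is 8 bits per channel; the extra 8 bits in
# 32-bit modes are alpha/padding, not additional color precision.
for bits_per_channel in (6, 8, 10, 12, 16):
    levels = 2 ** bits_per_channel   # shades per R/G/B channel
    total = levels ** 3              # combined RGB colors
    print(f"{bits_per_channel:>2}-bit/channel: {levels:>6} levels, {total:,} colors")
```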
 
I like how most people have no idea what HDR really is and how it has no effect on games today. "HDR" has been around since Far Cry 1 and Half-Life 2, in the sense of giving the illusion of an image with bright and dark spots. That has nothing to do with the real HDR that photos create. Games don't need that trick since the image is already computer generated, if that makes sense.
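For reference, the in-game trick being described here is tone mapping: the renderer computes lighting in a wide floating-point range, then compresses it into the display's narrow range. A minimal sketch in Python (the Reinhard operator below is just one textbook choice, not necessarily what any of those games used):

```python
# Tone-mapping sketch: scene luminance can be arbitrarily large, but an
# SDR display only shows 0..1, so the renderer squashes the range.
def reinhard(luminance: float) -> float:
    """Reinhard operator: maps [0, inf) smoothly into [0, 1)."""
    return luminance / (1.0 + luminance)

for scene_lum in (0.05, 0.5, 1.0, 4.0, 16.0):
    print(f"scene {scene_lum:>5} -> display {reinhard(scene_lum):.3f}")
```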

As for the other part of "HDR", which is color: monitors have supported 10-bit color since DX9, I believe. Only OpenGL and other apps that don't use DirectX require a professional card to output 10-bit. So technically I've been gaming in "HDR" for a few years now, ever since I got my workstation monitor. And on that point, you need a monitor that supports 10-bit, which right now means DisplayPort 1.2 or HDMI 2.0a.

The average gamer will need to spend $700 on a monitor just to get 10-bit support. But I will say that even though games are created using 8-bit images (maybe the masters aren't), you can see a bit of a color bump compared to a standard monitor. Though maybe it's because mine is calibrated that it gives the feeling of "looking better."
 
I am still gaming on 1080p, which I find more than enough. 4K monitors are too expensive and not really ready for gaming yet. But again, why do I need 4K on screen when 1080p is just enough?

My college professor once said: if you feed a man a burger his entire life he will be perfectly happy, but if you give him filet mignon he will never go back to a regular burger.

Very true. I had a slow 1080p e-IPS, and once I used a fast 1080p TN panel there was no going back (for FPS gaming). However, I preferred my e-IPS for movies. Then along came the 1440p IPS G-Sync. I was in shock and awe at the jump. No going back, but now I'm used to it. I can only go up, assuming the technology can support it.

I don't like being an early adopter. All those that bought 4K fall into that boat. There isn't enough content or support for it yet. That will all change in the coming years, which is when I'll jump to 4K.
 
I am still gaming on 1080p, which I find more than enough. 4K monitors are too expensive and not really ready for gaming yet. But again, why do I need 4K on screen when 1080p is just enough?

My college professor once said: if you feed a man a burger his entire life he will be perfectly happy, but if you give him filet mignon he will never go back to a regular burger.

Sure he will, when he only has $5 to spend. Your professor might have caviar tastes, but sometimes you only have a beer budget.
I would just spend my caviar budget on beer :)

4K isn't really that out of reach now; it just depends whether you prefer image quality or high fps. You can spend the same money on a 1080p 144 Hz G-Sync monitor as on a 4K 60 Hz monitor.
 
People crying about 4K probably haven't tried anything more than 1920x1080. It's so nice to have more space. I'm only on 2560x1080 and you will not see me go back.

There is much more to this world than gaming. Why people have a hard time with that is beyond me.
 
I am still gaming on 1080p, which I find more than enough. 4K monitors are too expensive and not really ready for gaming yet. But again, why do I need 4K on screen when 1080p is just enough?
Same here. Plus I game pretty casually (more of a living-room setup), so I don't think I could even see the difference between 1080p and 4K except by moving too close. This is why I'm not too bothered about 4K on HDTVs smaller than 40" or so.


I'm more interested in what HDR is actually supposed to be; the preview image of SDR vs. HDR just looks like the image on the left is on a screen that isn't calibrated properly, or like the image hasn't had its levels or color profile set correctly. As far as I can tell, HDR sounds a lot like the "auto levels" option in many photo editors, which stretches an image across the full range from 0 black to full white. If that's the case then it's not something I want, as sometimes an image simply doesn't contain full black or full white, and forcing it to ruins it; I can only imagine how badly you could mess up video with that.
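For what it's worth, the "auto levels" operation being compared to here is just a linear histogram stretch. A minimal sketch in Python (illustrative only) shows both the operation and why it mangles images that legitimately lack full black or full white:

```python
# Auto-levels sketch: linearly stretch pixel values so the darkest pixel
# becomes 0 and the brightest becomes 255. A deliberately low-contrast
# image (values 100..180) is forced to span the full range.
def auto_levels(pixels):
    lo, hi = min(pixels), max(pixels)
    if hi == lo:                 # flat image: nothing to stretch
        return list(pixels)
    return [round((p - lo) * 255 / (hi - lo)) for p in pixels]

print(auto_levels([100, 120, 140, 160, 180]))  # -> [0, 64, 128, 191, 255]
```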
 
1080p is great for 20-24"
1440p is great for 24-32"
2160p (4K) is great for 32-40"
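Those pairings roughly track pixel density, which is easy to check (a quick Python sketch; the size/resolution pairs below just mirror the list above):

```python
import math

# PPI = diagonal resolution in pixels / diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

for w, h, size in [(1920, 1080, 24), (2560, 1440, 27), (3840, 2160, 32)]:
    print(f'{w}x{h} at {size}": {ppi(w, h, size):.0f} PPI')
```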

Really, what these cards may provide is simple: mainstream upper-mid-range GPUs (~$300 USD) that can run current and future (say, two years ahead) demanding games at 3840x2160 @ 60 Hz, with the potential for upper-range cards to push to 120 Hz while maintaining frame rates of 60-90 fps.

As of today, no $300 card can promise as much. A $400 card might push the pixels at playable frame rates on high settings, but to play games at developer-intended detail settings at 4K, you're either spending $500 right now or waiting until 2016.
 
People crying about 4K probably haven't tried anything more than 1920x1080. It's so nice to have more space. I'm only on 2560x1080 and you will not see me go back.

There is much more to this world than gaming. Why people have a hard time with that is beyond me.


I love my 27-inch 1440p. I'll do 4K at 32 inches.
 
I bought a 4K IPS monitor two months ago and planned to stay at 1080p for gaming.

BUT why?

- A 10-bit IPS panel looks so much better than an 8-bit TN panel (which in fact is usually only 6-bit + FRC; see the sketch after this list).
- 4K at low settings often looks far superior to 1080p at ultra settings.
- If you don't play the newest triple-A action games, most games work just fine at 4K on high settings, even with a mid-range GPU such as the 270X.
- There is a huge difference between a 4K IPS monitor and an FHD TN monitor.
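(Aside, since "6-bit + FRC" comes up twice in this post: FRC panels fake the in-between 8-bit shades by rapidly alternating between the two nearest 6-bit levels, so the time-averaged brightness lands in between. A minimal sketch of the idea in Python, purely illustrative and using the simplified 6-bit = 8-bit/4 mapping:)

```python
# FRC (frame rate control) sketch: a 6-bit panel has only 64 levels, but
# it can approximate an 8-bit level by flickering between two adjacent
# 6-bit levels across several frames.
def frc_frames(target_8bit: int, n_frames: int = 4) -> list[int]:
    lo = target_8bit // 4             # nearest 6-bit level below
    frac = (target_8bit % 4) / 4      # how far toward the next level up
    return [lo + (1 if i < frac * n_frames else 0) for i in range(n_frames)]

frames = frc_frames(130)              # 8-bit 130 sits between 6-bit 32 and 33
print(frames)                         # -> [33, 33, 32, 32]
print(sum(frames) / len(frames) * 4)  # time-averaged -> 130.0
```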

Regarding the article:
- You don't really need FreeSync over HDMI with Full HD monitors.
- DisplayPort 1.2 works well for 4K.
- 10-bit per channel has been supported on AMD cards via a driver update since Nov 2014, for any card above the Radeon 7870.
I don't see a big bang here, just a different way to market this feature and call it "HDR."
The main problem is the monitors, which are mostly 6-bit + FRC TN panels; HDR will just be useless with the current type of gaming monitors.


 
I am not on board with the 4K hype either. Yes, it looks great and all, but it's still way too pricey. 1440p is getting pretty close to being a reasonable alternative.

I would agree with you, but right now 4K monitors can be found for quite literally the same cost as 1440p panels, and I don't mean just TN panels; full IPS monitors can be had.
By the next GPU release, I think I'll get a new GPU. I was just going to get another 290/390 for CrossFire, but after hearing that only 10% of this quarter's AAA titles have even supported SLI/CrossFire... I think I'll spend the greater $$$ and get the fastest single card.
 
I am still gaming on 1080p, which I find more than enough. 4K monitors are too expensive and not really ready for gaming yet. But again, why do I need 4K on screen when 1080p is just enough?

Right? Who needs progress? Why ever get a new car when my Pontiac from the '90s still runs? Why ever get a new house or bed when mine still works? Why play Fallout 4 when I can get Fallout 3 for $5?
The point is that new technology is always a good thing, and you should not bash others or discourage them from adopting it just because of your own unwillingness or inability to do the same.
 


Nowhere in his statement did he bash or discourage anyone, nor attempt to halt progress. It just so happens that a lot of people share his sentiment that 1080p is enough and 4K is still too expensive, hence impractical.
Not everybody can, or is willing to, spend thousands of dollars on a 4K panel and the accompanying hardware to make sure it runs perfectly.
 
I'm with Mr. Paw. Gaming on a big 1080p plasma TV is sweet. 2160p is nice but not ready for prime time in gaming. IPS panels aren't all that: they're great for color reproduction when properly calibrated, but no one gives a crapola about that when gaming.

And they don't come in 60-inch models 😀
Err, my 60-inch LG from a few years ago is an IPS display... actually, almost all LGs are IPS, so I don't know what you're talking about.
 
Getting a 4K monitor and gaming at 1080p is actually pretty attractive. You don't need as much GPU power as when running natively at 1440p, and 1080p scales cleanly on a 4K panel (each rendered pixel maps to an exact 2x2 block), while you still benefit from the high resolution in non-gaming scenarios. Getting G-Sync makes it more expensive, though, and that may be a reason to stick with a 1440p display. It comes down to your needs.
 
I am gaming at 1080p and will most likely stay there for quite a while.
This is because plasma color fidelity is too good compared to LED-backlit LCDs (and OLEDs are not yet reliable) for me to switch, even trading 1080p for 4K.

That being said, it's hard to even get a plasma nowadays, so I'm guessing the push will be for 4K.

This is AMAZINGLY good for me. While people will have to buy the most expensive GPUs to play 4K at ultra details, I'll be buying budget GPUs and still playing at ultra.

It took years to go from 720p to 1080p as a standard (TV broadcasting still hasn't caught up, but that's nothing surprising, as public broadcasters always stay decades behind the mainstream).

Looks good so far to me, but until AMD and Nvidia come up with something incredible (VR, maybe?) I'm not jumping on the hype train.
You need to get with the times; 1080p wasn't even a thing except for TVs. I've still got my 21" ViewSonic tube whose maximum was 1600x1200 @ 75 Hz.
Fluorescent light sucks; it does nothing but hurt my eyes and give me headaches, hence the dread a great share of office workers and schoolchildren feel for those buildings. There's only so much flicker-induced eye strain your eyes can take.

Plasma is good until you get image burn-in, and plasma is notorious for burning out after its first year; it rarely made it past three years of life.

I like plasma, but its cost is too prohibitive for its short lifespan and burn-in problems.

Two things that need to die are fluorescent lighting and 60 Hz. The minimum spec should have been 75 Hz since the death of CRT monitors.
1080p is all Sony's fault and never should have existed; 1200p had 1080p beaten like a dead horse before 1080p was even announced. Seriously, we've all had 1200p on CRTs since 1998. 1080p has been nothing but an industry-crippling deformity whose only purpose was to milk every penny out of every brain-dead user. Fluorescent lighting was just the stamp of CEO bonus greed: "It uses less power! Brilliant, I will pocket the savings AND build a gold statue of myself for my back yard! Severe eye strain and headaches? Not my eyes, I don't care!"

Designing anything for 60 Hz is a joke; 60 Hz is okay for 15 fps.
 


With the whole G-Sync vs. FreeSync war going on, and news of coming innovations releasing monthly at this point, I'm waiting until after I buy my next GPU first. Whether I go Nvidia or AMD will decide it; or, who knows, FreeSync may become the only standard. I don't feel like now is a stable time to upgrade anything without having immediate regrets in the spring of 2016, when gaming at 4K will truly be a possibility.
 
I'm going to tell all of you gaming on 1080p screens: 4K+ is amazing, and even 1440p is an impressive step. I won't go back to 1080p screens after this. It is truly amazing, and trust me, if you're content with 1080p you're really, really missing out.
 
With me using dual monitors, 1440p and 4K just aren't affordable options. Also, most of those monitors tend to be larger, and I'm not sure I can safely fit dual 27"+ monitors. My dual 24" 1080p Samsungs take up a lot of room as it is.
 


Why not do an ultrawide, then? I swapped out two 24" 1080p screens for one 3440x1440 panel at the office, and it's better all around, despite my initial concerns. It takes up much less room, and it's essentially like having five or six little 720p screens' worth of screen real estate, at a pixel density where text and images render very well.
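(That "five or six 720p screens" figure is easy to sanity-check, for the curious; a Python one-liner:)

```python
# Total pixels of a 3440x1440 ultrawide vs. a single 1280x720 screen.
print((3440 * 1440) / (1280 * 720))   # -> 5.375
```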
 