Question: Switched an AMD card for an Nvidia card and now only get 8-bit + dither gradients?

janos

Distinguished
Jan 3, 2012
9
0
18,510
0
Hi there,

Long story short: for years I worked with an AMD Radeon R9 290, which was ultimately this machine's bottleneck within an otherwise once "high end" build, and was always meant to be upgraded eventually. Well, that future has now arrived, and I decided to take things to another level after years of dodging every anti-aliasing option in games and being limited to 1080p, even though my monitor is capable of at least WQHD.

PC focus: Graphics editing (Photoshop: 90% / Maya: 10%) & Gaming

Specs:
  • CPU: i7-5820K CPU @ 3.30GHz
  • GPU: MSI GeForce RTX 2080 SUPER, 8 GB VRAM (before that: AMD Radeon R9 290, 4 GB VRAM)
  • MB: Gigabyte X99 Gaming 5
  • RAM: 32 GB
  • OS: Win7 64Bit (just for reference, please save any discussion on this) :)
  • Monitor:
    --AOC AGON AG322QC4 80 cm (31,5") Curved Monitor (HDMI, DisplayPort, USB Hub, Free-Sync 2, HDR 400, 4ms, 2560x1440, 144Hz)
    -- And a secondary monitor, barely worth mentioning, just for reference (a Samsung SyncMaster)
So I'm experiencing a heavy form of color depth loss right now: no matter what picture I open, wherever it contains certain color transitions it comes out dithered.
For a short while I believed it might be the well-known gradient banding that Nvidia is apparently too lazy to fix, while it doesn't affect AMD cards at all. But despite not being an expert, my testing of possible solutions convinced me more and more that it is in fact a color depth problem. Then again, that could very well be the same issue under another name tag, as I am no expert, which is why I'm here...

My impression is that since I mounted the Nvidia card, everything displayed in the desktop environment (browser, PS, etc.) is rendered at only 8-bit color depth with a dithering function activated. I found a suitable graphic online that should make the difference clear by comparison:

[Image: gradient comparison (8-bit, 8-bit + dithered, higher bit depth)]
If I understood it correctly, classic gradient banding looks more like the plain 8-bit gradient, whereas what I get is mostly a mix of the first and second versions, especially when zooming into a Photoshop picture I'm editing. I don't know whether this displays correctly for anyone else to compare, but to me it looks like the 8-bit + dithered hybrid from the picture above.
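The three looks being compared here (hard banding, dither grain, smooth) can be reproduced synthetically, which helps when judging screenshots. Below is a minimal NumPy sketch, not tied to any of the hardware above; quantizing to only 32 levels (instead of 256) and using a dither amplitude of one quantization step are arbitrary choices made to exaggerate the effect:

```python
import numpy as np

rng = np.random.default_rng(0)

# Smooth reference ramp: 1024 evenly spaced samples from 0.0 to 1.0.
ramp = np.linspace(0.0, 1.0, 1024)

# Plain quantization to 32 levels: the "banded" look with hard steps.
banded = np.round(ramp * 31) / 31

# Dithered quantization: add noise of about one step before rounding,
# which trades the hard steps for fine grain (the "8-bit + dither" look).
noise = (rng.random(ramp.size) - 0.5) / 31
dithered = np.round((ramp + noise) * 31) / 31

print("distinct banded values:", np.unique(banded).size)  # 32
print("mean abs error, banded:   %.5f" % np.abs(banded - ramp).mean())
print("mean abs error, dithered: %.5f" % np.abs(dithered - ramp).mean())
```

The dithered ramp actually has a slightly larger average error per pixel, yet it looks smoother to the eye because the error is spread out as noise instead of collecting into visible steps.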

Here's a suitable comparison (click for the larger version):

[Image: circle comparison (current RTX 2080 SUPER output vs. previous R9 290 output)]
The first circle is the result I get at the moment; the second is the result I originally got with my old GPU. I managed to reconstruct the old look in PS by switching the document from the standard 8-bit color depth to 16-bit while using the new GPU. I know what you're thinking, but this is no permanent solution: first, converting several textures into DDS format could cause compatibility issues that way; second, it doesn't enhance the look of every graphic I made before, so the results would merely look alike; and third, it doesn't solve the problem I keep having when viewing pictures/sites online. As a freelance graphics artist I have to make sure that "what I see is what I get", not some color-twisted Nvidia rendition.
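For what it's worth, switching an existing 8-bit document to 16-bit only widens the container; it cannot recover tonal detail that was never captured, which matches the second objection above. A minimal NumPy sketch (the ramp "image" is a purely illustrative stand-in for a real file, and the ×257 scaling mimics the usual 8-to-16-bit conversion):

```python
import numpy as np

# Hypothetical 8-bit grayscale "image": a ramp containing all 256 tones.
img8 = np.tile(np.arange(256, dtype=np.uint8), (4, 1))

# Converting to 16 bit widens the container (0..255 -> 0..65535),
# but cannot add tonal information that was never there.
img16 = img8.astype(np.uint16) * 257

print(np.unique(img8).size)   # 256 distinct tones
print(np.unique(img16).size)  # still 256 distinct tones
```

The extra precision only starts to matter for *subsequent* edits (curves, gradients, blurs), which can then land between the original 256 tones instead of being re-quantized.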

As mentioned before, I have the strong impression that Nvidia somehow messes with my desktop color depth, forcing it down to 8-bit dithered instead of the 32-bit standard. As a result, PS files that were classically made in 8-bit but used to look like 32-bit now look like the hybrid 8-bit dither version of the file. At the same time it makes no sense to me why this should be the case at all, so I really need an insider's point of view on this...

I'm sorry for mentioning some of the facts and assumptions twice; I'm somewhat mentally done with this by now, after searching for a cure and testing would-be solutions all night long. If there's anything I can provide to make the case clearer, please let me know. You're my last resort for finding land on this high(-tech) sea.

Thanks in advance, guys.
Janos

Edit#1:
As a new Nvidia user, I just found out that there are two different driver packages to install: Game Ready and Studio. I definitely have the former, as I'm having a hard time finding a Studio driver for a Win7 x64 system at all... I wonder if that would do the trick...
 

janos
Update #1 (re: Edit#1)
There are only W10 versions of the Studio driver, so that's not going to work. Is there a possible way to get both graphics cards working in one PC? I have serious doubts about that, beyond the fact that they consume no small amount of power... But would it be possible to define the Nvidia card as running the in-game content, while the old AMD card is told to rest as long as no Photoshop process is active?
 

janos
Okay, can someone please tell me whether I'm even in the right subforum? Because this topic is getting less traffic than half-baked threads that don't provide the minimum intel...

Is this case too difficult to answer, or what's wrong?
 

janos
A shame the other Tom's Hardware forum was closed; people were much more communicative, especially when served all the data up front...

I even started a new topic with an adjusted case by now. Still nothing, while this one slides down into the pile of unanswered topics. Apparently there's no point in bumping, since no one cares anyway.

It remains to be said that this is a very disappointing version of Tom's H.

Edit: For the record... Continued HERE with a different status quo and approach.
 
