AMD Responds To Nvidia GPP Controversy

Status
Not open for further replies.

manleysteele

Reputable
Jun 21, 2015
286
0
4,810
14
For the life of me, I can't figure out how this decision by NVidia made it through the company without someone stopping it in its tracks. It is ridiculous on its face.
 

hixbot

Distinguished
Oct 29, 2007
818
0
18,990
1
I never really had a problem with G-Sync, as it is a great feature and Nvidia GPUs are ahead by a mile in performance/watt.
But I really started to change my mind when I discovered that display manufacturers are forbidden from adding FreeSync/VRR to a monitor that has G-Sync. I'm OK with G-Sync displays being more expensive, as I understand the tech comes at a cost, but manufacturers should have the right to add FreeSync as an additional feature. It is insane that you can't buy a display that includes both G-Sync and FreeSync. As a consumer, this strongly colors my opinion of Nvidia, and not in a good way.
 

dudmont

Reputable
Feb 23, 2015
1,404
0
5,660
198
The genius at NVidia who thought this up should have his stock options given to homeless bums immediately. GPP is stupid, will backfire, and would probably cost them market share if AMD had the GPUs to make it happen. As it is, their supply situation sucks, so NVidia will survive this.
 

kinggremlin

Distinguished
Jul 14, 2009
574
41
19,010
0
Look in the mirror, AMD. There's the reason gamers don't have a choice. When was the last time you had a single-GPU card that traded blows at the highest end without having to make comical ergonomic compromises? The 7970 GHz? Not coincidentally, that's the last AMD/ATi video card I have owned, after owning nothing else going back to the original Radeon.

Stop over-promising, under-delivering, and wasting time blaming others for your incompetence.
 

blppt

Distinguished
Jun 6, 2008
446
2
18,785
0
IMO, the problem has never been with the hardware that AMD puts out (my V64 w/c is a beast on paper); it's always been about the drivers.

Comparing hard specs against the 1080 Ti FE, is there anything that jumps out at you that would make the 1080 Ti take the V64 behind the shed and whoop its butt? Sure, the 1080 Ti has an extra 3GB of framebuffer space, but I don't think there's a single game that struggles with 8GB even at 4K. Yet the Ti owns the V64 in every single game.

Drivers, drivers, drivers.
 

shovelroud

Commendable
Apr 17, 2018
3
1
1,510
0
I would have thought these types of practices violate anti-competition laws in numerous countries.
When any company leverages its might to restrict another company's trade, it should be illegal. History shows that competition breeds innovation and benefits end users as a whole. Got a better product? Prove it with performance/price, not by attacking your opposition and restricting their trade.
 

kinggremlin

Distinguished
Jul 14, 2009
574
41
19,010
0


Even when performance was comparable, their drivers were undeniably awful. It was the norm for them to announce software features that didn't make it into the drivers at release. The final straw for me was a driver update for my 7970, well after release, that completely fubar'd multi-monitor support for me. All you read on the forums now from AMD fans is how great their drivers are these days. I have no idea how true that is, but without a high-end competitive card, it doesn't matter to me anymore.
 

bit_user

Splendid
Ambassador

That was my first thought, but then I realized these are all (?) Taiwan-based manufacturers, and they probably have different laws and standards for anti-competitive practices.
 

Giroro

Honorable
Jan 22, 2015
801
202
11,390
13
I just want a G-Sync monitor that doesn't cost 3x more than the FreeSync equivalent. In most cases (until recently), an AMD video card plus a FreeSync monitor ends up being cheaper than the G-Sync monitor on its own.
 
NVIDIA, to give them the benefit of the doubt for now, may want to separate their brand more than it is today to help people buy with a level of trust.

It may or may not be anti-competitive in the way the rumor mills have spun it. People do seem to jump on the bandwagon for stuff like this.

If companies are forced to reserve a word like "Gamers" only for NVidia, though, that's wrong IMO, as the implication is that AMD cards aren't as good for gamers. If companies are essentially strong-armed into that, then to me that's anti-competitive.

Again, though, I'll wait for all the information I can get access to.
 

Brian_R170

Honorable
Jun 24, 2014
288
2
10,785
0
You know, if it was Intel instead of NVidia, you can bet there would be lawsuits, investigations, and government fines.

But I can sort of see NVidia's reasoning. They believe they have superior products, and they don't want Company ABC to have a brand XYZ that can have either an Nvidia or AMD chip in it. Why? Because they think brand XYZ on the box has a strong appeal in the minds of consumers, so consumers will be drawn to brand XYZ regardless of the underlying technology. Many companies take advantage of branding to sell inferior products to mindless consumers all the time. However, what they're also saying is that they think PC gamers are of the same ilk when it comes to buying graphics cards. So, if you're a gamer, regardless of whether you prefer NVidia or AMD, that should piss you off more than this being some anti-competitive marketing practice.
 
HIXBOT,
I get your concerns, if it's true that G-Sync monitors aren't allowed to add FreeSync, but that's possibly due to how the G-Sync module is implemented: the module replaces the scaler, so the video signal goes through the (proprietary NVidia) module in the monitor before showing up on the screen.

Now, I suppose theoretically you could have a separate path with a normal scaler, so if that was not allowed either, I might agree that's not right.

Having said that, it would be VERY CONFUSING for the specs when you might have, for example, a 60Hz panel with full G-Sync support (even below 30FPS via the G-Sync module's frame multiplier, such as 2x22FPS = "44Hz"), but one that in FreeSync mode works only from 40Hz to 60Hz.

I can see why, at least initially, it would be important for a new feature to stand alone.
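The frame-multiplier math above (a 22 FPS signal presented twice per refresh to land at "44Hz") is the same idea as low-framerate compensation. A rough sketch, assuming a simple keep-multiplying-until-in-range rule; the real G-Sync module logic is proprietary, so this is only an illustration:

```python
def lfc_refresh(fps: float, panel_min_hz: float, panel_max_hz: float):
    """Return (multiplier, effective_refresh_hz) for a VRR panel.

    If the frame rate drops below the panel's minimum refresh, each
    frame is presented multiple times so the panel still refreshes
    inside its supported range (low-framerate compensation).
    """
    if fps >= panel_min_hz:
        # In range: refresh tracks the frame rate, capped at the panel max.
        return 1, min(fps, panel_max_hz)
    multiplier = 2
    while fps * multiplier < panel_min_hz:
        multiplier += 1
    return multiplier, fps * multiplier

# The example from the post: 22 FPS on a panel that can't go below 30Hz.
print(lfc_refresh(22, 30, 60))  # -> (2, 44): each frame shown twice, "44Hz"
```

This also shows why a hypothetical 40-60Hz FreeSync range without frame multiplication simply has no answer for 22 FPS, while the module-based approach does.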
 

kinggremlin

Distinguished
Jul 14, 2009
574
41
19,010
0


Why would you think that? What sales restriction has been put in place? As Asus has just announced, AMD cards are now sold under different branding. There is nothing illegal about having company-exclusive branding. If Nvidia required Asus to put "Not recommended for gaming" or "Not designed for gaming" on Arez boxes, then there would definitely be some issues.
 
To this day, I think AMD buying ATI was stupid. ATI was in bankruptcy; they didn't need to pay as much as they did for it. ATI always had driver issues. I still have nightmares from time to time about an ATI customer trying to find the right driver for Excel. Back in those days they would mail you the driver, so a driver fix took something like two weeks, not minutes like today.
 

Joe Black

Honorable
Jul 3, 2013
88
0
10,640
1
Nothing new here. It's practically the force in GeForce. The G might stand for Graphics. Might stand for Gamer. Might stand for Garden. My money is on the last one.

They are always trying to force you into their walled garden. That's why they have so much money. It's hard to get back out from behind that wall.
 

blppt

Distinguished
Jun 6, 2008
446
2
18,785
0
"All you read on the forums now from AMD fans are how great their drivers are now. I have no idea how true that is, but without a highend competitive card, it doesn't matter to me any more."

They're pretty *stable* now, but there are few games that don't run better on my 1080 Ti compared to my V64 w/c, and usually it's not nitpicking either; it's pretty noticeable. So stable, yes, and their Crimson interface is 1000x better looking than Nvidia's ancient control panel, but the drivers just don't seem to be very well optimized for new games. Looking at the specs on paper, there's no reason there should be a night-and-day difference between the 1080 Ti and V64 w/c in games, especially when I'm running at a fairly low resolution (1080p) with Vsync.

I'm through blaming GameWorks for all these issues; these tessellation-saturating libraries have been around a long time now, and AMD has had ample time to optimize for this trend in their software or hardware.
 

blppt

Distinguished
Jun 6, 2008
446
2
18,785
0
"Now I suppose theoretically you could have a separate path with a normal scaler so if that was not allowed either I might agree that's not right..."

Wasn't there a manufacturer rumored to be doing just that a while ago, having both FreeSync and G-Sync in one monitor? Whatever happened to that?

NVidia goons got to them, lol.
 

redgarl

Distinguished
Nvidia has been getting on my nerves since they said they were going AI... but they haven't done much except push GPU sales for AI research. They have no clue what to do with unmanned vehicles and are pushing video analysis to deliver autonomous functions.

That's the company we are looking at right now: the cool company acting like Intel did back then. AMD needs to sue them.

I hope AMD understands how important Navi is. At 7nm, that thing could get really interesting sooner than we expect.
 

Math Geek

Champion
Ambassador
I would have done it the other way: left the well-known "gaming" and "ROG" brands (and the others) for AMD and then renamed the Nvidia parts.

Such as MSI 1060 Model 1, 1060 Model 2, and so on, to denote the various levels of cards from them. It keeps in line with Nvidia's demands and gives generic and rather stupid names to the Nvidia cards as a bit of an FU back at Nvidia. Of course, that's probably why no one at any of these companies asked my opinion on this in any way :/
 

mihen

Reputable
Oct 11, 2017
386
41
4,840
16
I think the biggest issue AMD faces right now in GPUs is game developers supporting nVidia's proprietary technology. For instance, in Fallout 4, if you turn off nVidia god rays on AMD systems, your performance jumps up. I don't think people understand how much developers rely on the money they get from nVidia to gimp their competition.
AMD drivers have been more stable and more timely in the last few years, unless you consider your GPU bricking a mild hiccup. Their performance is better when you use more neutral measures like DX12 titles.
 
