News EVGA Abandons the GPU Market, Reportedly Citing Conflicts With Nvidia

Page 7 of the Tom's Hardware community discussion.

Chung Leong

Respectable
What pile of money? EVGA's stated reason for bailing out of GPUs is that it is currently making losses on sales at current retail prices and manufacturing costs.
They could have continued to make some money licensing their brand to some no-name manufacturer. That's essentially free money, a nice paycheck coming in year after year. Now that option is essentially off the table, all because the CEO wanted his 15 minutes of fame on social media.
 

InvalidError

Titan
Moderator
They could have continued to make some money licensing their brand to some no-name manufacturer. That's essentially free money, a nice paycheck coming in year after year. Now that option is essentially off the table, all because the CEO wanted his 15 minutes of fame on social media.
Lending your company's name to a third party isn't something companies do, any more than you lend your identity to strangers. As long as you plan to remain in business or stay alive, you have to keep your identity to yourself.

What EVGA could license or sell are its GPU branding (ex.: FTW) and designs, similar to how IBM sold the Think trademarks to Lenovo when it bailed out of PCs.
 

Chung Leong

Respectable
Lending your company's name to a third party isn't something companies do, any more than you lend your identity to strangers. As long as you plan to remain in business or stay alive, you have to keep your identity to yourself.
Do you really think George Foreman has anything to do with the manufacturing of grills bearing his name? And Trump Steaks, Trump Vodka? Individuals and companies lend their names out all the time. Of course, companies don't like to invest advertising dollars into someone else's brand, so they make an effort to transition away from the arrangement. I don't remember exactly how long ThinkPads made by Lenovo bore the IBM badge, but it was at least several years, since Legend/Lenovo was completely unknown in North America at the time. Microsoft kept the Nokia branding for less than two years, as its own brand is well known. Given that its smartphone business went belly up shortly after the change, one could argue it wasn't a good idea. Whoever is making Nokia phones now will probably keep using the name for the foreseeable future. I can't remember the name of the company even though I had one of their phones in my pocket for two years.
 
Do you really think George Foreman has anything to do with the manufacturing of grills bearing his name?
George Foreman didn't become famous for making grills in his youth.
This would be like Foreman finding a short, skinny young girl (the complete opposite of him) and pretending that she is George Foreman the famous boxer.
What you are describing is more like how AMD once made some mountain bikes; that's just a marketing thing.
 
Reactions: Metal Messiah.

Darkbreeze

Retired Mod
Yes, lending somebody your name for marketing purposes is a much different thing than allowing them to pretend to BE you, which is what it would be if they licensed their name to somebody else. In fact, it might not even be legal, not if the company in question also wanted to continue conducting other forms of business under that same name.
 

KyaraM

Notable
No bloody way... they sell the most among Nvidia's partners. Gonna go check out those videos in a bit.

I had read some time back that Nvidia was screwing over the AIBs in some manner or other, but dang...
Pretty sure others sell more... they are one of the biggest, but not the biggest vendor.
 

KyaraM

Notable
~80% of their revenue comes from gpu sales. Sounds very impressive, until I heard the part about barely breaking even from it all, and profits from power supply sales are far higher... that's wild.
Some of those power supply sales have to be from bundles/combos, so if gpu sales halt, there will be fewer psu sales.

They're losing money from RTX 3080 and up, but making it from 3070Ti and down. Isn't that backwards behavior?



Comments across this site's older ARC news articles, TPU and EVGA forums [and reddit - bleh!]... have given the impression that it isn't enough to have more players.

Even if 1st gen ARC drivers weren't a crapfest... [Their fault for trying to do too much with it.]

Even if AMD's past driver stigma suddenly disappeared, driver updates were more frequent, RT and AI scaling development improved, etc...
[They are working with a fraction of the budgets that Nvidia and Intel have - what are some people bloody expecting them to do??? It's great what AMD is managing to achieve with Ryzen and RDNA, but ~no...]

Many folks won't budge at all with a break in the duopoly; Nvidia would still have a significant lead over the other 2 competitors. The green mindshare is too strong, and I can do naught but read/watch and shake my head...
I would try out Intel cards if I didn't have a great card already. Reason I'm not buying AMD cards again is that they all had severe software and heat issues. Personally, I really want a third option so I'm not locked to a single brand (not talking about AIBs here, actually, but Nvidia/AMD) anymore and actually have a choice. No, as stated above AMD is NOT a choice for me due to bad experiences.
 
Reactions: cyrusfox

logainofhades

Titan
Moderator
They could have continued to make some money licensing their brand to some no-name manufacturer. That's essentially free money, a nice paycheck coming in year after year. Now that option is essentially off the table, all because the CEO wanted his 15 minutes of fame on social media.
The CEO is in his 60s and doesn't want the stress of dealing with Nvidia anymore; he wants to spend more time with family.
 
Reactions: Metal Messiah.

Chung Leong

Respectable
What you are saying is like how AMD made some mountain bikes, that's just a marketing thing.
What I'm saying is HMD Global (I googled the name) makes Nokia phones. If AMD ever decides to exit the CPU market, I'm sure the AMD name would find itself on someone else's CPU. Atari missing out on the chance to put its name on the NES (and later, the Genesis) is regarded as one of the biggest debacles in tech history.
 
What I'm saying is HMD Global (I googled the name) makes Nokia phones. If AMD ever decides to exit the CPU market, I'm sure the AMD name would find itself on someone else's CPU. Atari missing out on the chance to put its name on the NES (and later, the Genesis) is regarded as one of the biggest debacles in tech history.
And while googling it, you didn't see that they were Nokia in the past, sold the phone business to Microsoft, and then bought it back?!
If EVGA sells the company to someone else, then yes, that someone else can do anything they want. But we are not talking about EVGA selling the company here.
https://en.wikipedia.org/wiki/HMD_Global
The company is made up of the mobile phone business that Nokia sold to Microsoft in 2014, then bought back in 2016.
 

InvalidError

Titan
Moderator
Do you really think George Foreman has anything to do with the manufacturing of grills bearing his name? And Trump Steaks, Trump Vodka? Individuals and companies lend their name out all the time.
The impeached former president didn't lend his name to vodka and steak; his companies ordered steaks, vodka, etc. to their specifications from manufacturers, then slapped his name on them before eventually filing for bankruptcy. Most PSU vendors don't make their own PSUs either. They send their requirements, specs, cosmetic designs, etc. to Seasonic, CWT, Delta, and so on; the vendor checks that the requirements were met and then sells the stuff.

Celebrities lending their name for marketing purposes doesn't carry the same weight, since there is rarely any reason to believe the person lending their name had much to do with the product beyond the apparent endorsement. Their name's worth and marketability can still be ruined by a bad partnership.
 
Reactions: Metal Messiah.

King_V

Illustrious
Ambassador
Reason I'm not buying AMD cards again is that they all had severe software and heat issues.
...
No, as stated above AMD is NOT a choice for me due to bad experiences.
Yeah, might want to temper that first statement and stick with just the last one. The last one speaks to your experiences. The first one states that every AMD card made had these issues. That is false.
 

Eximo

Titan
Ambassador
A while back they stated they would start making AMD cards, maybe that didn't go over well with their current contract when it came to renewal.

Still, I find this to be devastating. Like watching Wayne Gretzky or Michael Jordan retire.

I feel like this is only temporary.
EVGA made a few AMD motherboards, but never GPUs as far as I can recall. That would be one way to transition: try to become the top producer of AMD motherboards for AM5.
 
Reactions: spentshells

KyaraM

Notable
Yeah, might want to temper that first statement and stick with just the last one. The last one speaks to your experiences. The first one states that every AMD card made had these issues. That is false.
Errr... no, it doesn't? It's called context, which can be inferred by reading my statement. Though some of the cards I had were from the time when AMD actually did have huge software issues across a broad range of their cards, so even in that different context the claim wouldn't be completely false... and their cards run too warm for my tastes even today. But I digress.
 

Phaaze88

Titan
Ambassador
I would try out Intel cards if I didn't have a great card already. Reason I'm not buying AMD cards again is that they all had severe software and heat issues. Personally, I really want a third option so I'm not locked to a single brand (not talking about AIBs here, actually, but Nvidia/AMD) anymore and actually have a choice. No, as stated above AMD is NOT a choice for me due to bad experiences.
That's your experience, and as unfortunate as it was (sorry to hear that), it doesn't mean the entire brand is bad for everyone else, presently or later on.
What about those currently using Radeon without a hitch? Some have bad experiences with Nvidia too, though underrepresented.
There can be heat issues with models from both sides, so that one is a bit lost on me.

Will you still lock yourself to one option(Nvidia) if Intel has those same issues with future generations?
 

JamesJones44

Prominent
I believe they have a board of trustees, right?

They are ultimately responsible for corporate governance, but you're right, the final decision I believe remains with him.
That depends; not every private company has a board. It depends on the structure. In most cases, when a private company is still run by its founder(s), the board can't unseat them the way it can when the company is public. Who knows, though; maybe he's sold his shares to private equity investors and they will throw him out. It's hard to know with private companies.
 
Last edited:
Yeah, might want to temper that first statement and stick with just the last one. The last one speaks to your experiences. The first one states that every AMD card made had these issues. That is false.
I own a 6800 XT. The GPU hotspot never runs above 85C pushing hard in Fire Strike at Ultra QHD settings. Now that I undervolt it, it actually boosts higher and only averages 80-82C at the hotspot, while the GPU core temp itself stays in the 60-70C range. In PUBG at 1440p ultra settings it runs about the same temperatures. Playing CSGO, my new favorite game, a few days in as a noob, my GPU runs at only 48-52C, with the hotspot occasionally reaching 58-60C.

In terms of drivers, AMD is more solid with what is listed as WHQL (not optional) on their website. The optional drivers tend to have issues; not every user will be affected, but I was, and I learned that quickly. After weeks on the WHQL driver from their support site, I never encountered drama once. When we agree to be beta testers by always installing the latest driver that isn't even WHQL certified, we may run into issues. AMD always states this in their release notes, always. Want to run issue-free? Simple: use the latest stable driver. It's just like BIOS updates; people willing to install a beta version and then complain about issues are just not smart. LOL
 
Last edited:
That's your experience, and as unfortunate as it was (sorry to hear that), it doesn't mean the entire brand is bad for everyone else, presently or later on.
What about those currently using Radeon without a hitch? Some have bad experiences with Nvidia too, though underrepresented.
There can be heat issues with models from both sides, so that one is a bit lost on me.

Will you still lock yourself to one option(Nvidia) if Intel has those same issues with future generations?
Setting bias aside, Nvidia cards tend to run a lot hotter due to the nature of their boost; they just boost, boost, boost. I see Nvidia owners complain about temperatures every day, especially on OCN. They don't overclock their cards, but the cards just keep boosting and getting HOT. LOL
 

JarredWaltonGPU

Senior GPU Editor
Editor
Setting bias aside, Nvidia cards tend to run a lot hotter due to the nature of their boost; they just boost, boost, boost. I see Nvidia owners complain about temperatures every day, especially on OCN. They don't overclock their cards, but the cards just keep boosting and getting HOT. LOL
The biggest problem for Nvidia was GDDR6X temps, not GPU temps. Partner cards were way better than the Founders Edition for the most part (though some AIB cards sucked even worse, true). I replaced thermal pads on a Gigabyte Vision 3080 and the 3080 FE, and both dropped VRAM temps 10-20C. I rarely see the GPU temp on a 3080/3090 go above 80C, and it's often closer to 70C.

Now, if you fire up Ethereum mining (back when that was a thing), Nvidia GDDR6X temps shot up to 110C on many cards, definitely on the Founders Editions. I even tried replacing thermal pads on the 3080 Ti FE VRAM, and while it helped a bit, clearly that cooler design wasn't keeping up with 12 GDDR6X chips pumping out the heat.

In fact, GPU temps did go up on all the cards where I replaced thermal pads on the VRAM. That makes sense, as suddenly the VRAM was able to dump more heat into the cooler and that affected the GPU. But the GPU temps were still in the safe range.
 
The biggest problem for Nvidia was GDDR6X temps, not GPU temps. Partner cards were way better than the Founders Edition for the most part (though some AIB cards sucked even worse, true). I replaced thermal pads on a Gigabyte Vision 3080 and the 3080 FE, and both dropped VRAM temps 10-20C. I rarely see the GPU temp on a 3080/3090 go above 80C, and it's often closer to 70C.

Now, if you fire up Ethereum mining (back when that was a thing), Nvidia GDDR6X temps shot up to 110C on many cards, definitely on the Founders Editions. I even tried replacing thermal pads on the 3080 Ti FE VRAM, and while it helped a bit, clearly that cooler design wasn't keeping up with 12 GDDR6X chips pumping out the heat.

In fact, GPU temps did go up on all the cards where I replaced thermal pads on the VRAM. That makes sense, as suddenly the VRAM was able to dump more heat into the cooler and that affected the GPU. But the GPU temps were still in the safe range.
I have a 3080 FTW3 card and the temps regularly hit 80C while playing games in my small room. The ambient temp sensor (placed in the front intake of my Cooler Master H500M case) regularly reads 29-31C. My PC is in an 11x12-foot room and the temperature outside is usually 70-80F; I live on the coast of Southern California. With all that said, I have about 300-400W of tech running at idle (3 monitors, PS4, 5.1 speakers). I have found that if I place little heatsinks (these are the ones I used) on top of the backplate over the approximate locations of the memory chips, the card's overall temps drop massively: about 7C when running games and about 12C from idle. I would assume it would be even more effective on 3090-layout cards, where memory chips sit directly on the back of the card in contact with the backplate.
 
Last edited:

King_V

Illustrious
Ambassador
Out of curiosity, what are the dimensions of an individual one of those heatsinks? They seem overall cube-shaped, yet the listed dimensions LxWxH are listed as: 3.94 x 1.97 x 1.18 inches, which seems WAY off.
 
Out of curiosity, what are the dimensions of an individual one of those heatsinks? They seem overall cube-shaped, yet the listed dimensions LxWxH are listed as: 3.94 x 1.97 x 1.18 inches, which seems WAY off.
The ones I linked are not the exact ones I personally had lying around and used, but they appear to be 14mm x 14mm square. I couldn't figure out what the ones I used were because there are no identifying markings on them and I no longer have the packaging. The ones I specifically used have about a 1-inch by 1-inch footprint, with aluminum fins around 3/4 inch tall.
*I found the ones I used, or they are so close to what I have that they might as well be the same, and I updated the post I linked them in.
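For what it's worth, the listing's LxWxH of 3.94 x 1.97 x 1.18 inches converts to roughly 100 x 50 x 30 mm, which looks like the dimensions of the whole multi-piece package rather than a single ~14 mm heatsink. A quick conversion sketch (the inch figures come from the listing quoted above; the "package" interpretation is my guess):

```python
# Convert the listing's LxWxH (inches) to millimeters to sanity-check
# whether it describes one heatsink or the whole multi-piece package.
MM_PER_INCH = 25.4

listed_inches = (3.94, 1.97, 1.18)  # LxWxH from the listing quoted above
listed_mm = tuple(round(v * MM_PER_INCH, 1) for v in listed_inches)

print(listed_mm)  # (100.1, 50.0, 30.0) -- far larger than a single ~14 mm cube
```

So the "WAY off" numbers are almost certainly the shipping box or a tray of heatsinks, not one unit.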
 
Last edited:
Reactions: King_V

Phaaze88

Titan
Ambassador
Setting bias aside, Nvidia cards tend to run a lot hotter due to the nature of their boost; they just boost, boost, boost. I see Nvidia owners complain about temperatures every day, especially on OCN. They don't overclock their cards, but the cards just keep boosting and getting HOT. LOL
When there are factors such as differences in process nodes and total board power, how is that a fair comparison?

The smaller the node > greater thermal density > more difficult to cool (may warrant more specialized cooling solutions).

Among the current halo (6950 XT) Radeons, many have their total board power capped between 300-330W. Nvidia has matched and passed that with the 3070 Ti...
Memory doesn't use much power at all - in the single digits (W) - but the higher board power SKUs see it bathe in more heat. GDDR6X having higher operating thermals than GDDR6 doesn't help.
 

SunMaster

Proper
When there are factors such as differences in process nodes and total board power, how is that a fair comparison?

The smaller the node > greater thermal density > more difficult to cool (may warrant more specialized cooling solutions).

Among the current halo (6950 XT) Radeons, many have their total board power capped between 300-330W. Nvidia has matched and passed that with the 3070 Ti...
Memory doesn't use much power at all - in the single digits (W) - but the higher board power SKUs see it bathe in more heat. GDDR6X having higher operating thermals than GDDR6 doesn't help.

There's a reason the GDDR6X chips get hot.

This video explains GDDR6X and power draw: https://www.youtube.com/watch?v=HqbYW188UrE
It's absolutely not in the single digits of watts.
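A rough back-of-envelope estimate supports the point that aggregate GDDR6X draw is well beyond single digits of watts, even though each individual chip stays in that range. The sketch below assumes Micron's published figure of roughly 7.25 pJ per bit transferred for GDDR6X and the RTX 3080's 760 GB/s peak bandwidth; both numbers are my assumptions (not from the thread), and real draw also depends on refresh, I/O termination, and access patterns:

```python
# Back-of-envelope GDDR6X power at full bandwidth.
# Assumption: ~7.25 pJ per bit transferred (Micron's published GDDR6X figure).
PJ_PER_BIT = 7.25e-12           # joules per bit

bandwidth_gbytes = 760          # RTX 3080 peak memory bandwidth, GB/s (assumed)
bits_per_second = bandwidth_gbytes * 1e9 * 8

watts_total = bits_per_second * PJ_PER_BIT
print(round(watts_total, 1))    # ~44.1 W across all ten chips
```

Spread over ten chips that is indeed single digits per chip, but the memory subsystem as a whole lands in the tens of watts, so both sides of the argument are partly right depending on whether you count per chip or in total.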
 
