News GeForce RTX 4080 Emerges at U.S. Retailer Starting at $1,199

Is it really? We just went through almost 2 years of 3080s selling for almost $2,000 and 3090s cresting over $3,000. The 4080 is going to beat both of those, but this price is shockingly high to you?

For something like gaming, yes, that is shocking. $2,000+ is also shocking.

Businesses were the ones buying GPUs for $2k-$3k because they thought they could make a profit. For the most part, the only gamers getting 3080s and 3090s were celebrity influencers/streamers, who are typically shipped the gear they endorse directly from their sponsors, for free.
 
The whole world has gone mad and has lost sight of the real value of things. He has forgotten that prices are not mandatory; they are subject to negotiation. Compulsory prices were imposed by governments back when socialism ruled in some countries. Why the hell, when you go into a store, don't you haggle with the cashier instead of paying the price on the tag?
 
The whole world has gone mad and has lost sight of the real value of things. He has forgotten that prices are not mandatory; they are subject to negotiation. Compulsory prices were imposed by governments back when socialism ruled in some countries. Why the hell, when you go into a store, don't you haggle with the cashier instead of paying the price on the tag?
I do not know where you live, but here in the US, negotiating prices anywhere other than a boutique store, thrift store, or car dealership will get you a choice of buying the product at the sticker price or being asked to leave.
 
It really is time for many people who believe in democracy, freedom and how capitalism works to think about whether what they believe is true. :)
 
It really is time for many people who believe in democracy, freedom and how capitalism works to think about whether what they believe is true. :)
I have previously worked out how the premise of Capitalism that encourages innovation and competition is broken in this industry. Freedom has well-established limits tested by the law, and we are a Republic, not a democracy. I do believe in the sentiment of your statement. For this industry to truly and properly heal, it needs something to bring it out of its decades-long decline and promote competition on every facet.
 
I have previously worked out how the premise of Capitalism that encourages innovation and competition is broken in this industry.
Capitalism as a driver of innovation has been dead in pretty much every industry that has consolidated into heavily vertically integrated oligopolies - you don't need to compete on delivering value per dollar to consumers when your consumers are a captive audience.
 
Does that materially matter to the consumer? Would you be more willing to pay $1200 for a scalped 4080 that had an $800 MSRP than $1200 for the same card with a $1200 MSRP?
It does, because you have no idea whether the scalper is a legitimate seller or whether the product has been damaged in any way, shape, or form; same as buying used (same risks). There's also the warranty, which in some cases may not be transferable (speaking more broadly).

There are "material" downsides to buying from scalpers, not just the upfront cost of the thing you're trying to get.

Regards.
 
At least with cell phones you still have the option of getting Chinese phones that give you 80-95% of the flagship features and performance for $150-250, albeit at the expense of effectively having no warranty and few if any firmware updates.

True... I just hope we will get Chinese GPUs on the market sometime in the future. There are some now, but they haven't been seen in the Western market yet!
That would help the situation somewhat.
 
At idle, a GPU doesn't really use much wattage, so when you're not gaming, the amount it contributes is minor at best.

[attached chart: GPU idle power draw]
Just a personal opinion, but I think 22 W or 30 W is actually a lot to burn @ idle. This might sound weird, but it might be the reason I don't buy an Intel A770.

BTW, what the heck is going on with the RX 6400 idling at 10 W, while the RX 6600 sits at 6 W ???
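
To put rough numbers on the idle figures above, here's a minimal back-of-envelope sketch of what those watt levels cost over a year. The electricity price and hours-per-day are assumptions, not anything stated in the thread.

```python
# Back-of-envelope yearly cost of GPU idle power draw.
# Assumptions (not from the thread): $0.15/kWh electricity, PC on 8 hours/day.
IDLE_WATTS = [6, 10, 22, 30]   # idle figures mentioned above
PRICE_PER_KWH = 0.15           # assumed electricity price, $/kWh
HOURS_PER_DAY = 8              # assumed time the PC is on but not gaming

for watts in IDLE_WATTS:
    kwh_per_year = watts * HOURS_PER_DAY * 365 / 1000
    cost = kwh_per_year * PRICE_PER_KWH
    print(f"{watts:>2} W idle -> {kwh_per_year:5.1f} kWh/year -> ${cost:.2f}/year")
```

Whether roughly $3 vs. $13 a year under those assumptions (or a few times that for an always-on machine) counts as "a lot" is exactly the judgment call being argued here.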
 
This depends entirely on what you have connected to your graphics card. With nothing running on my computer, it pulls 90-105 watts at idle. That's because of two 4K monitors and a 1440p monitor.
That's too much for just doing video output of a static desktop. Please confirm which card you're talking about, and how you measured that.
 
The People enabled this; they spoke with their money. It doesn't have to be everyone, but enough.
2017's x80 Ti for 700 USD? That was very good.
It was very good, but I think the only reason that happened is Nvidia was anticipating a repeat of the R9 Fury X vs. 980 Ti contest and priced it accordingly. If they had known Vega 64 would only compete with the regular GTX 1080, I'm pretty sure the GTX 1080 Ti would've had an MSRP $100 to $200 higher. Especially if you look at what the Pascal Titan was priced at; it was barely faster than the 1080 Ti, nor did it have any special features like Tensor cores.

"Only money" talks - and well? 4090s sold, 4080s will too, and so on. The trends can only continue, as The People are okay with it.
What worries me is Nvidia's plan to shift wafer allocation to making more H100. As long as they can keep selling those, a slump in the gaming/mining market won't hurt them and they won't have to lower prices either. So, I guess the hope is that they ultimately reach saturation of H100 demand and still aren't sustaining enough revenue. Then, they might contemplate some price drops in their gaming GPUs. Though, I still think they're likely to focus as much re-pricing as possible on their 3000-series and try to maintain the high MSRPs of the 4000-series - especially premium cards that out-perform any 3000-series card.
 
IMO, the problems arise when an industry has very little competition. If competition on performance existed, then competition on price would follow. More products of similar performance and features in the same industry drive prices down because of their similarity. The problem with this industry is that it is niche and constantly in a state of decline, so nobody wants to break into it.
Intel and Imagination would beg to differ.

The reason the industry is resistant to competition is the extremely high investment you have to make, before you can truly compete. If Intel is having this much trouble, imagine what hurdles a less well-resourced entrant would face!

Building a competitive gaming GPU means that you not only have to get the hardware right, but you also need highly-optimized drivers. And what counts as a "driver" includes the shader compiler and game-specific optimizations. What's more, if you want support for APIs like OpenGL and Vulkan, you're basically on your own - you don't have Microsoft helping you, like they do with DirectX.
 
It was very good, but I think the only reason that happened is Nvidia was anticipating a repeat of the R9 Fury X vs. 980 Ti contest and priced it accordingly. If they had known Vega 64 would only compete with the regular GTX 1080, I'm pretty sure the GTX 1080 Ti would've had an MSRP $100 to $200 higher. Especially if you look at what the Pascal Titan was priced at; it was barely faster than the 1080 Ti, nor did it have any special features like Tensor cores.
Quadro driver support?
 
That's too much for just doing video output of a static desktop. Please confirm which card you're talking about, and how you measured that.
I have an EVGA FTW3 3080 10GB on the quiet BIOS, and as a test I did a clean boot of Win10 (latest version) with my 55-inch LG CX 4K TV at 120 Hz, my 27-inch ViewSonic Elite XG270QG at 120 Hz, and my 31.5-inch Samsung 4K U32H85x at 60 Hz. I opened HWiNFO, and total GPU power averaged over 30 minutes of AFK, with nothing happening on my computer, was 99 W, with an 89 W minimum and a 109 W maximum. Are you trying to tell me that I am wrong due to the monitoring software, or what? My total power pulled at the wall for my entire PC in this state also says 158 watts. Is that wrong too? No. Multi-monitor setups pull significantly more power in every setup I have personally built.
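
For anyone who wants to reproduce this kind of 30-minute average without HWiNFO, here is a minimal sketch that polls board power once per second via nvidia-smi and reports min/average/max. It assumes an Nvidia card with nvidia-smi on the PATH; it is not the tooling anyone in this thread actually used.

```python
# Minimal sketch: sample GPU board power once per second via nvidia-smi
# and report min/average/max over the sampling window.
# Assumes an Nvidia GPU with nvidia-smi available on the PATH.
import subprocess
import time

SAMPLES = 30 * 60  # 30 minutes at one sample per second

readings = []
for _ in range(SAMPLES):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    readings.append(float(out.splitlines()[0]))  # first GPU only
    time.sleep(1)

print(f"min {min(readings):.1f} W  "
      f"avg {sum(readings) / len(readings):.1f} W  "
      f"max {max(readings):.1f} W")
```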

Intel and Imagination would beg to differ.

The reason the industry is resistant to competition is the extremely high investment you have to make, before you can truly compete. If Intel is having this much trouble, imagine what hurdles a less well-resourced entrant would face!

Building a competitive gaming GPU means that you not only have to get the hardware right, but you also need highly-optimized drivers. And what counts as a "driver" includes the shader compiler and game-specific optimizations. What's more, if you want support for APIs like OpenGL and Vulkan, you're basically on your own - you don't have Microsoft helping you, like they do with DirectX.
I said specifically that the lack of competition is the reason for the industry's woes. I left out the exact reasons that you so aptly laid out because it would be the 3rd or 4th time I have made the same post on forums in the past month, and I can no longer be bothered going into the explanation of why the competition sucks. Instead, I cut the fat and stated that the lack of competition is the root of the issue.
 
Building a competitive gaming GPU means that you not only have to get the hardware right, but you also need highly-optimized drivers.
I'd say an even bigger obstacle is all of the patents involved. You likely cannot make a GPU worth a damn without potentially infringing on hundreds of hardware and software patents covering every little detail - details companies patented either defensively, to protect themselves against frivolous lawsuits over similarly minor things that shouldn't have been patentable in the first place, or to help lock competition out of the oligopoly. If you dare to poke the lions without first acquiring a patent portfolio large enough to force a cross-licensing deal, you are almost guaranteed to get sued into oblivion, no matter how good your hardware and drivers might be.
 
I have an EVGA FTW3 3080 10GB on the quiet BIOS, and as a test I did a clean boot of Win10 (latest version) with my 55-inch LG CX 4K TV at 120 Hz, my 27-inch ViewSonic Elite XG270QG at 120 Hz, and my 31.5-inch Samsung 4K U32H85x at 60 Hz. I opened HWiNFO, and total GPU power averaged over 30 minutes of AFK, with nothing happening on my computer, was 99 W, with an 89 W minimum and a 109 W maximum.
Thanks!

Are you trying to tell me that I am wrong due to the monitoring software, or what?
No, I'm just saying that it shouldn't take that much power to read framebuffer values from memory and send them out to a display. Even if the display is YUV. So, there's something going on that I definitely don't understand.

For reference, a little Raspberry Pi can do this with 2x 4K monitors @ 30 Hz, and its GPU is 28 nm garbage. The whole device shouldn't use even 10 W to do that.
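
To put a number on "reading framebuffer values from memory and sending them out": here's a minimal sketch of the raw scanout bandwidth for the three displays described above, assuming 32-bit color and ignoring blanking intervals and compression, compared against the roughly 760 GB/s of memory bandwidth an RTX 3080 10GB has on paper.

```python
# Rough scanout bandwidth for a static desktop on the three displays
# described above. Assumes 4 bytes/pixel; ignores blanking and compression.
displays = [
    ("LG CX 4K @ 120 Hz",       3840, 2160, 120),
    ("XG270QG 1440p @ 120 Hz",  2560, 1440, 120),
    ("U32H85x 4K @ 60 Hz",      3840, 2160, 60),
]

BYTES_PER_PIXEL = 4
GPU_BANDWIDTH_GBS = 760  # approx. peak memory bandwidth of an RTX 3080 10GB

total = 0.0
for name, w, h, hz in displays:
    gbs = w * h * BYTES_PER_PIXEL * hz / 1e9  # GB/s of framebuffer reads
    total += gbs
    print(f"{name:25s} {gbs:5.2f} GB/s")

print(f"{'total':25s} {total:5.2f} GB/s "
      f"(~{total / GPU_BANDWIDTH_GBS:.1%} of peak memory bandwidth)")
```

The total works out to around 8 GB/s, about 1% of the card's peak memory bandwidth - so whatever is pushing the card to ~100 W at idle, it isn't the raw framebuffer traffic itself.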

My total power pulled at the wall for my entire PC in this state also says 158 watts. Is that wrong too?
If the GPU figure is correct, then that sounds about right - roughly 60 W left over for the CPU, motherboard, RAM, drives, fans, and PSU losses is plausible for an idle system.

I said specifically that the lack of competition is the reason for the industry's woes. I left out the exact reasons that you so aptly laid out because it would be the 3rd or 4th time I have made the same post on forums in the past month, and I can no longer be bothered going into the explanation of why the competition sucks.
You can put it in your profile, for easy access, and then just drop a link to it each time it comes up. Or just bookmark a link to your best post on it and drop that link, instead of repeating yourself.

Anyway, it's your speculation vs. mine. I know a little about graphics APIs and I know a GPU driver developer, so mine is somewhat informed. But, you're entitled to your opinions, even if they're wrong.
; )
 
Anyway, it's your speculation vs. mine. I know a little about graphics APIs and I know a GPU driver developer, so mine is somewhat informed. But, you're entitled to your opinions, even if they're wrong.
; )
We have come to the exact same conclusion, no? If I am wrong, you are also saying you are wrong, no?

Intel and Imagination would beg to differ.

The reason the industry is resistant to competition is the extremely high investment you have to make, before you can truly compete. If Intel is having this much trouble, imagine what hurdles a less well-resourced entrant would face!

Building a competitive gaming GPU means that you not only have to get the hardware right, but you also need highly-optimized drivers. And what counts as a "driver" includes the shader compiler and game-specific optimizations. What's more, if you want support for APIs like OpenGL and Vulkan, you're basically on your own - you don't have Microsoft helping you, like they do with DirectX.
You said that the reason the industry is "resistant to competition" is the extremely high investment required, and then proceeded to explain the minutiae of the industry's woes. Are you or are you not agreeing that competition, or the lack thereof, is the reason the industry is in such a state? Am I misinterpreting what you said? If you have your own take on the specific reason the industry is the way it is, I would like to hear it.