News: AMD Slashes Price of 7950X3D and 7900X3D By Up To $100

Apart from AMD's official store, Amazon, Newegg, and Microcenter are offering similar price cuts.

Microcenter is currently offering the 7900X3D chip for $549.99. The chip also carries a $10-$15 discount at other US retailers such as Amazon and Newegg.

Microcenter is also offering small discounts on the rest of the AMD Ryzen 7000 desktop CPU family, but these are only available for in-store pickup and the price cuts are minor.
 
Apr 1, 2020
Justify spending

$450 on a 7800X3D
$550 on a 7900X3D
$600 on a 7950X3D

Compared to $320 for a 13600K or $335 for a 5800X3D, especially if you don't have an RTX 4090 and/or don't play at 1920x1080.

[attached chart: gaming FPS comparison]
 

DavidLejdar

Prominent
Sep 11, 2022
Justify spending

$450 on a 7800X3D
$550 on a 7900X3D
$600 on a 7950X3D

Compared to $320 for a 13600K or $335 for a 5800X3D, especially if you don't have an RTX 4090 and/or don't play at 1920x1080.

For me it is easier to justify than spending big on a smartphone or car. And I don't have a 4090, and I play at 1440p, so I know I wouldn't get the nominal FPS boost shown in the chart. There are other considerations though, such as crowd density in open-world games, and strategy and simulation games, where the CPU has work to do that isn't only directly about FPS. (And in particular, the 13600K seems to consume more power for less performance, which is technically a cost factor as well.)

But sure, when someone plays just, e.g., CS:GO now and then, that runs well enough even on older (and cheaper) hardware. And with a limited budget it may well make sense to go for, e.g., AM4 and spend a bit more on the GPU instead.
 

sitehostplus

Honorable
Jan 6, 2018
How do I justify paying up to $800 for an AMD Ryzen 9 7950x3d?

Two words: electric bill. I don't need a 300+ watt Intel space heater raising my electric bill even higher than it is now.

The money I'll save now and in the future on electricity will justify the high AMD price alone.
 

kal326

Distinguished
Dec 31, 2007
Meanwhile I'm enjoying my 7900X, 32GB of DDR5-6000, and X670 board from a $589 Microcenter combo. MSRP is a good jumping-off point, but this is what the market is actually paying. For some at least.
 
The electric bill thing might need some context, because I hear this a lot but nobody seems to put any real numbers behind it.

So let's compare the i5-13600K to the Ryzen 7800X3D in terms of power consumption. Going by what TechPowerUp got in its numbers, the average power consumption across 12 games was 49W for the 7800X3D and 89W for the i5-13600K. Let's just put these in isolation for now, because the motherboard is a factor that can't be the same between them. So using just this number, the Intel chip has a power consumption factor of 1.816x.

Let's say in a given week you play games for about 30 hours, and call it 4 weeks in a month.
  • If we went by the most expensive electricity rate in the US, that would be Hawaii at $0.49 per kilowatt-hour. So it would cost you $2.88/mo to run the AMD CPU, $5.23/mo to run the Intel CPU. It would take you about 55 months, or 4.58 years to break even.
  • If we went by the average electricity rate in the US, that would be $0.1372 per kilowatt-hour. So it would cost you $0.81/mo to run the AMD CPU, $1.47/mo to run the Intel CPU. It would take you about 197 months, or 16.4 years to break even.
  • If you happen to live in Nebraska with the cheapest rate of about $0.0935 per kilowatt-hour, well, I'm pretty sure you'd long forgotten about your PC before it had a chance to break even.
The only thing that this may negatively impact is the temperature of your room, but I doubt 89W is really going to make that much of a difference vs 49W. And if you have to run the AC longer... well if you're in Hawaii, you're probably running the thing most of the day anyway.

Also I think the funny thing here is this is the same argument that people made for buying Intel over the AMD FX chips.

EDIT: Before anyone wants to go "well ackshually" on my numbers, this is a simplified example; I can't be bothered to account for your exact usage pattern or things like time-of-day electricity rates.

If you want to crunch your own numbers, go right ahead.
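
If you do want to crunch them, here is a minimal sketch of the same break-even math, assuming the figures above (49W vs 89W gaming averages, 30 hours/week, 4 weeks/month, and a $130 price gap between the $450 7800X3D and the $320 13600K); the function names are just illustrative:

```python
# Break-even estimate for the gaming power-draw gap, using the assumptions
# above: TechPowerUp's 12-game averages (49 W vs 89 W), 30 h/week of gaming,
# 4 weeks/month, and a $130 price gap ($450 7800X3D vs $320 13600K).

def monthly_energy_cost(watts, rate_per_kwh, hours_per_month=30 * 4):
    """Dollars per month to run a part drawing `watts` for the given hours."""
    return watts * hours_per_month / 1000 * rate_per_kwh

def months_to_break_even(price_gap, cheap_cpu_watts, pricey_cpu_watts, rate_per_kwh):
    """Months until the pricier-but-frugal CPU's power savings cover its premium."""
    monthly_saving = (monthly_energy_cost(cheap_cpu_watts, rate_per_kwh)
                      - monthly_energy_cost(pricey_cpu_watts, rate_per_kwh))
    return price_gap / monthly_saving

for label, rate in [("Hawaii", 0.49), ("US average", 0.1372), ("Nebraska", 0.0935)]:
    months = months_to_break_even(130, cheap_cpu_watts=89, pricey_cpu_watts=49,
                                  rate_per_kwh=rate)
    print(f"{label}: {months:.0f} months ({months / 12:.1f} years)")
```

Swap in your own rate, hours, and price gap as needed.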
 

eldakka1

Honorable
Dec 24, 2018
Where are you getting this info on the 7950x3d? I'm looking at AMD's store, and the price is STILL $699.

Don't just believe me, take a look for yourself.

Take a look yourself...
Ryzen 9 processors

7950X3D $599.00
7900X3D $549.99

ETA: make sure you are on the US store, as the price cuts don't apply (at least so far) across every region. It might take a week or two to be reflected everywhere - assuming it's a base price cut and not just a limited-time special price.
 
The electric bill thing might need some context, because I hear this a lot but nobody seems to put any real numbers behind it.

So let's compare the i5-13600K to the Ryzen 7800X3D in terms of power consumption. Going by what TechPowerUp got in its numbers, the average power consumption across 12 games was 49W for the 7800X3D and 89W for the i5-13600K. Let's just put these in isolation for now, because the motherboard is a factor that can't be the same between them. So using just this number, the Intel chip has a power consumption factor of 1.816x
The biggest issue with the power draw numbers for the X3D parts is that gaming benchmarks are limited in scope and potentially fit fully into the cache, so you get the power draw that corresponds to the CPU never having to go outside the cache.
We need some extensive research on how much of that translates to the real world.
As things are, we can only be sure about the power benefit in the benchmarks.

Also, the 12-game average power draw for the 13600K in the link you posted is 74W, not 89W.
It's 89W in the 7800X3D review.
Maybe it's due to the change from the 3080 to the 4090.
 

slash3

Distinguished
Jan 16, 2008
AMD has not dropped the price of the 7900X3D and 7950X3D.

The pricing you are suggesting in your post (and was similarly parroted earlier by WCCFTech) is from one retailer - MicroCenter. They price certain components at discounted rates as a loss leader for in-store only promotions, and AMD's product list reflected this "lowest" pricing when viewing by category. Here's the 7900X3D's retailer availability as an example.

The 7950X3D was, and is still, $699.99. The 7900X3D was, and is still, $599.99. Clicking through to AMD's web store or other retailers will reflect this.

While I do feel that AMD's habit of displaying the lowest partner price for any given SKU on their overview page is misrepresentative, being an informed consumer and actually reading descriptions is a life skill which is extremely important. What's also problematic is publishing articles as statements of fact without doing due - or any - diligence. A simple click-through would have revealed that AMD hadn't made any change, and that the premise of the entire post is complete nonsense.

TL;DR: AMD didn't drop pricing. One store did, if you happened to walk in during their initial promotion period. No shipping.

You're killin' me, Smalls. Do better. Stay in school, kids.
 

abufrejoval

Reputable
Jun 19, 2020
Please guys, do me a little favor: take a 7950X3D, deactivate the high-clock CCD and re-run your benchmarks against the 7800X3D.

I can't think of any reason why the 7800X3D could win that race.

So if the 7950X3D is somehow worse at gaming, that's a software issue that urgently needs sorting out.
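
(For anyone who wants to approximate this at home without a BIOS change, one rough stand-in is pinning the game to the V-Cache CCD with CPU affinity. A minimal Linux sketch, assuming the cache CCD happens to be enumerated as cores 0-7 with SMT siblings 16-23 on your system, and with a placeholder game path - not the same as disabling the frequency CCD outright, but close enough for a sanity check:)

```python
# Rough stand-in for "disable the frequency CCD": pin the game to the
# V-Cache CCD via CPU affinity (Linux only).
import os
import subprocess

# Hypothetical core list: on many 7950X3D systems the V-Cache CCD shows up
# as cores 0-7 with SMT siblings 16-23 -- confirm with `lscpu -e` first.
VCACHE_CORES = set(range(0, 8)) | set(range(16, 24))

os.sched_setaffinity(0, VCACHE_CORES)   # restrict this process to those cores
subprocess.run(["./your_game"])         # the child process inherits the mask
```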
 

Elusive Ruse

Commendable
Nov 17, 2022
Based on the chip's incredible performance value, AMD has no choice but to discount its higher core count CPUs to keep them enticing to gamers.
I mean they had the choice not to drop the price, but it's good that they did. Any time a company discounts prices of their current gen products is a good time.
 

sitehostplus

Honorable
Jan 6, 2018
The electric bill thing might need some context, because I hear this a lot but nobody seems to put any real numbers behind it.

So let's compare the i5-13600K to the Ryzen 7800X3D in terms of power consumption. Going by what TechPowerUp got in its numbers, the average power consumption across 12 games was 49W for the 7800X3D and 89W for the i5-13600K. Let's just put these in isolation for now, because the motherboard is a factor that can't be the same between them. So using just this number, the Intel chip has a power consumption factor of 1.816x

Let's say in a given week, you play games for about 30 hours. And then just say there's 4 weeks for a month.
  • If we went by the most expensive electricity rate in the US, that would be Hawaii at $0.49 per kilowatt-hour. So it would cost you $2.88/mo to run the AMD CPU, $5.23/mo to run the Intel CPU. It would take you about 55 months, or 4.58 years to break even.
  • If we went by the average electricity rate in the US, that would be $0.1372 per kilowatt-hour. So it would cost you $0.81/mo to run the AMD CPU, $1.47/mo to run the Intel CPU. It would take you about 197 months, or 16.4 years to break even.
  • If you happen to live in Nebraska with the cheapest rate of about $0.0935 per kilowatt-hour, well, I'm pretty sure you'd long forgotten about your PC before it had a chance to break even.
The only thing that this may negatively impact is the temperature of your room, but I doubt 89W is really going to make that much of a difference vs 49W. And if you have to run the AC longer... well if you're in Hawaii, you're probably running the thing most of the day anyway.

Also I think the funny thing here is this is the same argument that people made for buying Intel over the AMD FX chips.

EDIT: Before any one wants to go "well ackshually" on my numbers, this is a simplified example and I can't be bothered to account for when you use your computer for your use case and if you have something like different electric rates depending on the time of day.

If you want to crunch your own numbers, go right ahead.

Here you go:

[attached chart: CPU power consumption, from the Ryzen article]


That is from the Ryzen article. Notice how the high-end i9 is using 220 watts at max, while the high-end Ryzen 9 (the 7950X3D P80 UV model) is using 108 watts max.

That is about half the power draw of the i9 part. So while your Intel i9 doubles as a space heater, mine will use half the power. And I bet I can save money on electricity.
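
(A quick way to put a dollar figure on that gap, reusing the 30 hours/week and 4 weeks/month assumptions from the break-even post above and the ~$0.1372/kWh US average rate; note the 220W/108W figures are peak/stress numbers rather than typical gaming draw, so treat it as an upper bound:)

```python
# Monthly cost difference at the peak figures above (220 W vs 108 W),
# assuming 30 h/week * 4 weeks of load and the ~$0.1372/kWh US average rate.
hours_per_month = 30 * 4
rate_per_kwh = 0.1372
diff_kwh = (220 - 108) * hours_per_month / 1000       # 13.44 kWh per month
print(f"~${diff_kwh * rate_per_kwh:.2f} per month")   # roughly $1.84
```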
 

atomicWAR

Glorious
Ambassador
Justify spending

$450 on a 7800X3D
$550 on a 7900X3D
$600 on a 7950X3D

Compared to $320 for a 13600K or $335 for a 5800X3D, especially if you don't have an RTX 4090 and/or don't play at 1920x1080.

An argument for socket longevity can be made for AM5. Besides being tired of Intel builds in general, socket longevity was the biggest reason I went with an AM5 socket this build. But if you're building strictly on price you're not wrong, and if you don't plan to upgrade for 5 years (approx.) or so, AM5 socket longevity doesn't add as much to the conversation, assuming 2026/7 as EOL.
 
Here you go:

[attached chart: CPU power consumption, from the Ryzen article]


That is from the Ryzen article. Notice how the high-end i9 is using 220 watts at max, while the high-end Ryzen 9 (the 7950X3D P80 UV model) is using 108 watts max.

That is about half the power draw of the i9 part. So while your Intel i9 doubles as a space heater, mine will use half the power. And I bet I can save money on electricity.
So all you do with your CPU is run y-cruncher?
 
Apr 1, 2020
An argument for socket longevity can be made for AM5. Besides being tired of Intel builds in general, socket longevity was the biggest reason I went with an AM5 socket this build. But if you're building strictly on price you're not wrong, and if you don't plan to upgrade for 5 years (approx.) or so, AM5 socket longevity doesn't add as much to the conversation, assuming 2026/7 as EOL.

After the whole "all socket AM4 motherboards will support all AM4 CPUs" fiasco, which saw me have to drop $350 on a new motherboard for my 5950X (which itself doesn't hit stated speeds even with the per-core curve optimizer), with AMD backtracking only because it needed a way to compete with Intel on price, I will never trust anything AMD says about longevity again, and I wouldn't be one bit surprised if come 2025 AMD doesn't support these 600-series boards.
 
Apr 1, 2020
The electric bill thing might need some context, because I hear this a lot but nobody seems to put any real numbers behind it.

So let's compare the i5-13600K to the Ryzen 7800X3D in terms of power consumption. Going by what TechPowerUp got in its numbers, the average power consumption across 12 games was 49W for the 7800X3D and 89W for the i5-13600K. Let's just put these in isolation for now, because the motherboard is a factor that can't be the same between them. So using just this number, the Intel chip has a power consumption factor of 1.816x

Let's say in a given week, you play games for about 30 hours. And then just say there's 4 weeks for a month.
  • If we went by the most expensive electricity rate in the US, that would be Hawaii at $0.49 per kilowatt-hour. So it would cost you $2.88/mo to run the AMD CPU, $5.23/mo to run the Intel CPU. It would take you about 55 months, or 4.58 years to break even.
  • If we went by the average electricity rate in the US, that would be $0.1372 per kilowatt-hour. So it would cost you $0.81/mo to run the AMD CPU, $1.47/mo to run the Intel CPU. It would take you about 197 months, or 16.4 years to break even.
  • If you happen to live in Nebraska with the cheapest rate of about $0.0935 per kilowatt-hour, well, I'm pretty sure you'd long forgotten about your PC before it had a chance to break even.
The only thing that this may negatively impact is the temperature of your room, but I doubt 89W is really going to make that much of a difference vs 49W. And if you have to run the AC longer... well if you're in Hawaii, you're probably running the thing most of the day anyway.

Also I think the funny thing here is this is the same argument that people made for buying Intel over the AMD FX chips.

EDIT: Before any one wants to go "well ackshually" on my numbers, this is a simplified example and I can't be bothered to account for when you use your computer for your use case and if you have something like different electric rates depending on the time of day.

If you want to crunch your own numbers, go right ahead.

Don't forget that even if your CPU is pulling 50W, even a midrange GPU will be using far more power, with modern midrange cards clocking in at well over 200W and easily over 250W, generating far more waste heat than your CPU.
 
How do I justify paying up to $800 for an AMD Ryzen 9 7950x3d?

Two words: electric bill. I don't need a 300+ watt Intel space heater raising my electric bill even higher than it is now.

The money I'll save now and in the future on electricity will justify the high AMD price alone.
The X3D CPUs are not great productivity CPUs, so I don't get your post. A locked Intel i7 costs less, is a great all-around CPU, and doesn't use much juice when gaming.
 
