> It makes perfect sense to pair an x3d with a strong midrange GPU for 1080p gaming.
Yeah, no.
> Well, that happened before Lunar Lake launched.
I picked up my LGA1700 mobo with a 12700k at Alder Lake's release, when only the 12900k was faster for gaming (it replaced my 5775c). Later I bought into the drop-in upgrade hype, got a 13900kf when they released, and moved my 12700k over to an Xbox-looking ITX build in my living room. The 13900kf was enough faster that I sold the 12700k and replaced it with a 13600k. Those are what I'm running now, and there aren't any issues as long as the voltages don't go over 1.5v. Mine don't go over 1.35v because of diminishing returns and heat and stuff.
> All 13th and 14th gen are at risk, with a TDP of 65w or more. The way Intel handled this fiasco, I don't trust that the issue is actually fixed. Thankfully my 12700k doesn't have these issues.
Everything is at risk of something. Have there been a significant number of 13600k degradation failures reported? I haven't seen any. I think they're rarer than X3D pops.
https://www.tomshardware.com/pc-com...says-damage-is-irreversible-no-planned-recall
> Yeah, no.
Yeah, it really does. My 5900x limits my 7800xt at 1080p quite often, especially in multiplayer shooters. A 5800x3d drop-in replacement would alleviate that bottleneck, but it would slow down the overflow of physics simulations from my office that I get paid to run on my machine. (I'm an engineer for a small race prep shop/race team.)
It makes a lot more sense to save $265: get a 13600kf for $175 instead of $440 for a 9800X3D, and put the difference toward your GPU if you only have the budget for a midrange one.
I had a 13600k degrade, but I ran it heavily OC'd. Took three tries, but I eventually got a new one out of them.
And congrats on your 9800X3D purchase. It should work great.
Perhaps I should have said "no more risk than any other CPU, as evidenced by there being basically no confirmed at-stock degradation incidents in the real world," but "no risk" seemed close enough, because I didn't expect the 100.00000% perfection argument to come up.
That hasn't been my experience even at 1080p with a much faster card than your 7800xt, but sure, you do you.
I never said I bought one. I have too many non-gaming tasks to do. I simply said my 5900x bottlenecks me in multiplayer shooters at 1080p. In Factorio I often end up 50-60 fps below the numbers reviews give for x3d chips. In Warhammer 3 I'm extremely CPU-bottlenecked, but I don't know whether it gets big gains from x3d. If I buy anytime soon it'll be a $300 7900x. That's a hell of a deal.
So did you have to increase your voltage to maintain your clocks? Did you leave your LLC at stock? I've never seen degradation in person (I have seen burn-in with a 5775c, though: it was stable at 4.3GHz when new but dropped to 4.2GHz after a few months); the closest I've come is watching someone demonstrate it and its fix in a video. But I have seen intermittent voltage spikes over 1.6v from an overclock with stock LLC settings, which I fixed by adjusting the LLC. You shouldn't get spikes over 1.5v with a 13600k unless you push it to 13900k clocks.
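Catching those transient spikes is the tricky part, since a monitoring GUI's instantaneous readout can miss a spike between refreshes. Here is a minimal peak-hold polling sketch, assuming a Linux box whose board sensor driver exposes Vcore as a hwmon voltage node; the sensor path below is hypothetical (find yours via the in*_label files), and software polling still can't see microsecond-scale transients the way a scope can:

```python
# Poll a hwmon voltage node and keep the peak, to catch transient
# Vcore spikes that an instantaneous readout can miss.
# Assumption: the board driver exposes Vcore as an in*_input file
# (values are in millivolts per the hwmon sysfs ABI).
import time

SENSOR = "/sys/class/hwmon/hwmon2/in0_input"  # hypothetical path; adjust for your board

peak = 0.0
try:
    while True:
        with open(SENSOR) as f:
            volts = int(f.read().strip()) / 1000.0  # millivolts -> volts
        if volts > peak:
            peak = volts
            print(f"new peak: {peak:.3f} V")
        time.sleep(0.01)  # ~100 samples/sec; very fast transients can still slip through
except KeyboardInterrupt:
    print(f"highest Vcore observed: {peak:.3f} V")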
My 12900k is majorly degraded. It's quite easy to tell, since I went from 1.06v to 1.16v for the same 4.9 / 4.0 / 4.0 clocks.
I'm on the other side of gaming. I've been gaming at 4k since 2014 (back then with SLI 780tis and my Panasonic TC58AX800U, which has a DP 1.2 input), and my fastest 4k display only goes to 120Hz. Mostly single-player story games. Since most recent CPUs can generally average 120fps, the difference to me is the mins, at any time, including the most complex areas where framerate drops hurt your visibility. Above 120fps I never see a difference. Below 60fps is very apparent, and I usually game at 60fps (usually with interpolation to 120) due to 1) still using that old Panasonic, because I only have space for a 60" TV, 2) GPU limitations, and 3) being used to it, so faster matters less to my eyes than smoother. Really. I also have a 65" 120Hz Samsung in the living room that does FreeSync, and I have compared.
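Since "the mins" carry that whole argument, it's worth pinning down how they're measured. A minimal sketch, assuming a single-column CSV of frametimes in milliseconds (the kind you can export from PresentMon or CapFrameX); note that "1% low" definitions vary slightly between tools, and the one used here (average FPS over the slowest 1% of frames) is just one common convention:

```python
# Compute average FPS and "1% low" FPS from a frametime log.
# Assumption: frametimes.csv is one column of frametimes in milliseconds.
import csv

with open("frametimes.csv") as f:
    frametimes_ms = [float(row[0]) for row in csv.reader(f) if row]

# Average FPS = total frames / total seconds.
avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

# 1% low (one common definition): average FPS over the slowest 1% of frames.
worst = sorted(frametimes_ms, reverse=True)  # slowest frames first
n = max(1, len(worst) // 100)
low_1pct_fps = 1000.0 * n / sum(worst[:n])

print(f"average: {avg_fps:.1f} fps, 1% low: {low_1pct_fps:.1f} fps")
```

Two runs can share an average yet feel completely different if one has a much worse 1% low, which is exactly the mins-versus-averages distinction being made here.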
> I understand your frustration, but newcomers to the PC party just google stuff like "7700x vs 13600" and the top result is usually Userbenchmark. So to those newcomers who might also be on Tom's, it's a Public Service Announcement, in a way.
Then slap them properly in an article, without mincing words.
Daniel Owen's numbers are awful, though. I get 3 times his framerates in Hogwarts with a 12900k. There is something very off with his numbers.
For my uses, the 13900kf is real close to the top, with the 7800X3D behind it. The 9800X3D may be ahead, but it would take a lot of money for not a lot of gains. As for future-proofing, Unreal games and bad console ports are a thing, and they do the dip. I want my 2-year-old 13900kf to last through the next Cyberpunk and Witcher. After that, replacing it will be fine with me.
A couple supporting videos:
https://youtu.be/2DfGNPiNTuM
I play these kinds of games, and lows like this are why I replaced my 5775c. My 13900kf is easily 25% faster in the same Hogwarts scenario, btw, and did great with Starfield with no traversal dips, but that might have been my Optane.
https://youtu.be/Q-1W-VxWgsw
This one goes to extremes, but it does show the weaknesses of X3D and has a good explanation at the end.
There are a lot of people who play single-player games on TVs, and a lot who game with vsync while maxing out graphics. For them, mins in difficult games matter more than averages they will never see. RPL is pretty close to the 9800X3D here.
But there are also those who prefer other types of games, have different monitors, or just value higher averages more. And the 9800X3D is pretty far ahead on those.
Max settings with RT enabled really drop the framerate for me. You also have to move around like you're actually playing the game.
I know. I'm still getting 3 times his framerates with max RT while running around in Hogsmeade.
> The guy has a 13900k, bud.
And? The issues affect 13th and 14th gen CPUs.
> on my 9800x 3d tomorrow,
As I have said, considering how much you praise Intel, I doubt you have any AMD-based systems at all, or that you would buy AMD. Why? Because it makes every one of your posts seem moot and contradicts everything you say.
Ok bud, this just came in today. You can even see the AMD laptop I don't have there.
Yeah, ok. I can post a random pic from a Google search too... I still doubt anyone who praises Intel.
What CPU is he using in that run?
> Oh damn, you got me.
Then I guess that means ALL of your posts praising Intel are just trolling, and you are a hypocrite... and as such, all of your posts praising Intel should be taken as FUD. Gotcha.
Yep, you got me again. Damn.
7800X3D. But you're also getting way more frames than I get with a 13900kf.
Here is mine; settings are at the start of the video. He must be doing something wrong, I'm telling you.
https://www.youtube.com/watch?v=lxJWGdFZyVI
> 7800X3D. But you're also getting way more frames than I get with a 13900kf.
I can tell you my secret: it's mostly the RAM latency.
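For reference, the usual back-of-the-envelope for RAM latency: true CAS latency in nanoseconds is CL cycles at the memory clock, and since the memory clock is half the transfer rate, that works out to ns = CL x 2000 / (MT/s). A tiny sketch (the kits listed are purely illustrative, not anyone's actual setup in this thread):

```python
# True CAS latency in nanoseconds: CL cycles at the memory clock,
# which runs at half the transfer rate, so ns = CL * 2000 / (MT/s).
def cas_latency_ns(cl: int, transfer_rate_mts: int) -> float:
    return cl * 2000.0 / transfer_rate_mts

# Illustrative kits only:
for cl, rate in [(14, 3200), (18, 3600), (30, 6000), (36, 7200)]:
    print(f"DDR-{rate} CL{cl}: {cas_latency_ns(cl, rate):.2f} ns")
```

This is why a tuned lower-latency kit can close a surprising amount of the gap in CPU-bound games: a DDR4-3200 CL14 kit sits at 8.75 ns of CAS latency, tighter than a stock DDR5-6000 CL30 kit at 10 ns, even though the DDR5 kit has far more bandwidth.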