Question: Should I upgrade? Or is my current CPU not set up right?


daylightriot

Commendable
Mar 4, 2018
89
5
1,545
I currently have a Ryzen 2600 @ 4.2 GHz with 3600 MHz RAM in dual channel. It bottlenecks my 980 Ti quite a bit. I recently played Darksiders 3 and my GPU was running at 70% max during gameplay at 1080p ultra, no vsync.
Does anyone have experience with this issue? According to all the benchmarks out there, my CPU should barely bottleneck this GPU, and when it does it should be between 5-10%. If I can't find a way to resolve this then I'll probably buy an 8600K and a Z390 mobo soon. Since switching to AMD I always seem to be waiting for that CPU with better IPC and higher clocks.
 

daylightriot

Thanks for posting your findings.

I do want to point out that if you look at basically any gaming test, Ryzen processors always lag behind Intel in pure FPS performance. If you want the maximum FPS from whatever system you are building, Ryzen is not your first choice.

The advantage with Ryzen is that the performance drop-off is minimal: it's still more than enough to saturate today's gaming GPUs, giving you well over 60 fps at 1080p in any game, and not a massive percentage behind Intel, all while often being significantly cheaper and beating Intel at certain other tasks (whether those matter to you is personal preference). At higher resolutions, Ryzen processors are side by side with Intel as the GPU becomes the limiting factor.

This is one of those times when yes, there is a "bottleneck" in the purest sense of the word. But the reality is: is it affecting your gaming? Not really, no. Do your games play smoothly and at a high fps, without performance issues? Yes, they do. Bottlenecks aren't going to hurt your equipment, and unless you are playing competitively, does it really matter if you get 134 fps versus 144 fps on an Intel processor? It doesn't. Is it worth $100-$300 more for the processor? Certainly not.

I'm seeing a bigger disparity in fps than what you're claiming there. I'm seeing bottlenecks of 20-30% at 1080p in a lot of AAA titles.

To put that into perspective, a 30% bottleneck in a game where you're getting 100 fps average means I could be getting 130 fps if my CPU wasn't bottlenecking me. Bear in mind, I game on a 144 Hz monitor. I want the highest fps possible.

It's also stopping me from upgrading my GPU, as the 2600 will just bottleneck the better GPU down to the same level of performance, making the upgrade a waste of money.

As a PC gamer, I want a CPU that I can keep for a few years with enough single-core headroom that I don't have to buy a new CPU/mobo/GPU every time I want better performance.

With this in mind, does anyone have any hardware recommendations? I won't be upgrading until after I've checked out Ryzen on 7 nm, but I would like to get a bit more clued up on what's good price/performance-wise on Intel's platform.
 

Rogue Leader

It's a trap!
Moderator
I'm seeing a bigger disparity in fps than what you're claiming there. I'm seeing bottlenecks of 20-30% at 1080p in a lot of AAA titles.

To put that into perspective, a 30% bottleneck in a game where you're getting 100 fps average means I could be getting 130 fps if my CPU wasn't bottlenecking me. Bear in mind, I game on a 144 Hz monitor. I want the highest fps possible.

It's also stopping me from upgrading my GPU, as the 2600 will just bottleneck the better GPU down to the same level of performance, making the upgrade a waste of money.

As a PC gamer, I want a CPU that I can keep for a few years with enough single-core headroom that I don't have to buy a new CPU/mobo/GPU every time I want better performance.

With this in mind, does anyone have any hardware recommendations? I won't be upgrading until after I've checked out Ryzen on 7 nm, but I would like to get a bit more clued up on what's good price/performance-wise on Intel's platform.

How are you calculating your bottleneck? Where are you getting 20-30%? Your theory doesn't hold water. You're running a 2600, which is midrange, not the fastest Ryzen (the 2700X), and you're also comparing it to the performance of the fastest CPU you can get.

Take a look at these performance tests:


The performance difference there is 18% at 1080p, and that's with a 2080 Ti. As you can see, you will have WAY more fps than you're getting with your 980 Ti.

Look here

https://www.techspot.com/review/1614-ryzen-2600/page3.html

Removing the GPU bottleneck from the equation, you're talking about a 7-10 fps difference between a Ryzen 5 2600 and an equivalent i5-8600K.

I think your expectations are far beyond what you purchased. If you want the MAX FPS you should have been buying a 2700X if you were going AMD, or an 8700K or 9700K from Intel. Anything less is going to hold back your framerates, but it's nowhere near as much as you think it is.
 

daylightriot

How are you calculating your bottleneck? Where are you getting 20-30%? Your theory doesn't hold water. You're running a 2600, which is midrange, not the fastest Ryzen (the 2700X), and you're also comparing it to the performance of the fastest CPU you can get.

Take a look at these performance tests:


The performance difference there is 18% at 1080p, and that's with a 2080 Ti. As you can see, you will have WAY more fps than you're getting with your 980 Ti.

Look here

https://www.techspot.com/review/1614-ryzen-2600/page3.html

Removing the GPU bottleneck from the equation, you're talking about a 7-10 fps difference between a Ryzen 5 2600 and an equivalent i5-8600K.

I think your expectations are far beyond what you purchased. If you want the MAX FPS you should have been buying a 2700X if you were going AMD, or an 8700K or 9700K from Intel. Anything less is going to hold back your framerates, but it's nowhere near as much as you think it is.
If you look at the Far Cry 5 benchmark, you can see a CPU bottleneck. Far Cry 5 is very CPU intensive (single-core speed), whereas most of the other games on that list are not. And before you mention it, the Ashes of the Singularity benchmark is optimised for Ryzen. The one surprise on that list for me is AC Origins. I was only getting 80% GPU utilisation on that title.

As far as how I calculate the bottleneck, I got the numbers from personal experience using it, plus a load of recent tests in CPU-heavy games. I'll upload a few videos of different CPU-heavy games so you can see what I mean. This is why I asked for advice on this forum. According to online benchmarks by popular YouTubers, I shouldn't have this much of a bottleneck.
 

Karadjgne

Titan
Ambassador
Your understanding of fps and bottlenecks is somewhat wrong. Fps has exactly nothing to do with the GPU; fps is set solely by the game code and CPU. The CPU always works at 100% of its ability: whether that's 3.2 GHz or 4.9 GHz, that's what it works at. If the game code says it needs up to 4 threads, it doesn't matter if you have a 16-thread-capable CPU; the game will use up to 4. That's all. The CPU will pre-render that game code as fast as it can. If that's 60 or 100 or 500 frames, that's what it shoves at the GPU. Now it's up to the GPU to paint that picture, at the detail settings and resolution specified. If you get 50 or 90 or 400, it simply means the GPU cannot paint the picture as fast as the CPU can ship it. If you get 60 or 100 or 500, it means the GPU can paint the picture up to the limit of frames the CPU is sending.

At NO time does a GPU ever bottleneck a CPU. The only bottleneck a CPU ever sees is from anything behind it (RAM, storage, game code, etc.), not from stuff in front of it like a GPU.

I don't care what your videos say, or what benchmarks claim: if you are getting 100 fps in game, it's one of only two things. Either the CPU simply cannot deliver more than 100 fps (due to drivers, game code, storage hangups, slow RAM, or being maxed out) or the GPU cannot put the picture on screen any faster (due to drivers, detail settings, resolution, or any combo of these). That's it.

You'll know the cause by lowering or raising detail settings. If fps changes, it's a GPU problem: the GPU is having an easier or harder time painting the picture. If fps doesn't really change at all, the GPU is not the issue; the CPU is. It's hit a maximum for some reason, and the GPU is painting every single frame the CPU sends, all 100 of them, at any detail setting.

First figure out where the supposed issue is, then search for why. Doing it the other way round just runs you in circles and gets you nowhere.
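The settings-change diagnostic described above can be sketched as a tiny helper; the 5% threshold for "fps doesn't really change" is an arbitrary assumption, not from the post:

```python
def likely_limiter(fps_low_settings: float, fps_high_settings: float,
                   tolerance: float = 0.05) -> str:
    """Change detail settings and compare fps. If fps swings with the
    settings, the GPU is the limiter; if it barely moves, look at the
    CPU (or game code / drivers). Threshold is an arbitrary 5%."""
    change = abs(fps_low_settings - fps_high_settings) / fps_high_settings
    return "GPU-limited" if change > tolerance else "CPU-limited (or engine/driver cap)"

print(likely_limiter(160, 90))   # big swing with settings -> GPU-limited
print(likely_limiter(102, 100))  # fps barely moves -> CPU-limited (or engine/driver cap)
```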
 

Rogue Leader

If you look at the Far Cry 5 benchmark, you can see a CPU bottleneck. Far Cry 5 is very CPU intensive (single-core speed), whereas most of the other games on that list are not. And before you mention it, the Ashes of the Singularity benchmark is optimised for Ryzen. The one surprise on that list for me is AC Origins. I was only getting 80% GPU utilisation on that title.

As far as how I calculate the bottleneck, I got the numbers from personal experience using it, plus a load of recent tests in CPU-heavy games. I'll upload a few videos of different CPU-heavy games so you can see what I mean. This is why I asked for advice on this forum. According to online benchmarks by popular YouTubers, I shouldn't have this much of a bottleneck.

I'm not arguing that there isn't a "bottleneck"; did you even read what I wrote? You came into this expecting max framerates from a midrange Ryzen processor. I don't know where you got the impression that it would do better than the equivalent Intel; it won't.

The numbers I presented don't lie. You aren't calculating it correctly: you specifically said you have a 20-30% bottleneck with the expectation of getting 30 fps more (over something giving you 100 fps), and that's just not true. If you told me you were getting 160 fps in a game and the GPU was capable of 30 fps more, then it could be true. 1 fps does not scale to 1 percentage point, nor does 70% GPU utilization equate to a 30% bottleneck.

You mentioned Darksiders 3:


At 1080p there is a 22% difference in framerate when it's unlocked. Still, you're at 165 fps.

If you're not getting that performance then maybe there is something wrong with your system, your BIOS settings, or something else. But your logic and numbers don't support your theory.
 

InvalidError

Titan
Moderator
Wait up. So this whole post is because you are concerned that the GPU was only running at 70%? You are concerned that the CPU is slowing down the GPU and only allowing it to run at 70% of max? Is that about correct?
Some people have an unhealthy obsession with bottlenecks even when they don't affect their experience in any meaningful way. I don't give a damn what my CPU and GPU usage are as long as I have consistently playable frame rates, I'll worry about upgrades when I can no longer get that.
 
Update

After work today I set up dual boot and installed Windows 10 Pro, then installed all updates. A quick check showed a number of Ryzen updates installed. I then used the same settings in RE2, the same GPU driver, and recorded a GPU utilisation video with ShadowPlay on both OSes.

Video 1: Windows 10 Pro
View: https://youtu.be/aKfhFmay-jY


Video 2: Windows LTSB
View: https://youtu.be/sFe23Ddl9yc


As you can see, the so-called Ryzen and gaming updates do next to nothing.

I believe I have done everything possible at this point to try to remove this bottleneck at a software level. That leaves a hardware upgrade as the only feasible option.

I would like to thank all of you for your contributions in helping with this issue. I'll wait for next-gen Ryzen and watch every hardware review benchmark of it I can, as well as check single-thread scores; if AMD still lags behind Intel, I will change platform.

I don't understand the reasoning for this test. You can get ANY CPU to bottleneck when you turn down the game settings and let the GPU run at 200+ fps. That's the way code works: you'll eventually find the loop that takes the longest to run no matter which processor, and the calculations take longer than the next frame time. With my Vega 64, I was pegged at 100% GPU usage and still getting 90 fps, which still looks amazing with FreeSync. For first-person shooters, that's generally where I like to target my frame rates; for single-player games I target 60 fps.

If you crank your game settings up to normal and the GPU is pegged at 100%, then you're not CPU bottlenecked. Who spends this much money on a PC to game at 720p?

You really need to buy a G-Sync or FreeSync monitor. Nvidia is now supporting FreeSync on some monitors. 90 fps is all you need for crystal-clear, smooth, quick gameplay with FreeSync and negligible input lag. My monitor is also 144 Hz, but I can't tell any difference once I get past 90 fps.
 

Karadjgne

And here's the kicker: he's only getting 80% GPU utilization in AC Origins. Why does it sound like he's upset that there's "obviously" a 20% bottleneck because he's not getting 100% utilization?

Seems the OP's misconceptions are really confusing him to the point of frustration over imagined concepts.
 

daylightriot

And here's the kicker: he's only getting 80% GPU utilization in AC Origins. Why does it sound like he's upset that there's "obviously" a 20% bottleneck because he's not getting 100% utilization?

Seems the OP's misconceptions are really confusing him to the point of frustration over imagined concepts.

<removed by moderator> Why don't you explain it to me? As far as I'm aware, if you're not getting 100% GPU utilisation then something's holding it back. Am I wrong?
 
Last edited by a moderator:

Karadjgne

Yes. And I did explain.

It's the game code. It decides what is used or not. When all that gets shipped to the GPU, the GPU has to paint the frames on screen. It does so at 100% of its ability. But you have to figure there are really two things going on, not just one.

Imagine a screen that's all one color. The CPU ships its pre-rendered frames to the GPU; that might be 100 fps. For a single-color screen like that you'll get all 100 fps, as it's easy for the GPU to paint; it might take 20% of the GPU's ability to do so. Now take a complex set of frames with explosion detailing, which involves massive amounts of partial movement. You'll still get the 100 fps input from the CPU, but the GPU takes longer to paint each picture due to the huge pixel counts with different colors and motion, so fps drops because fewer frames are being output. Yet you might only see the GPU using 60% of its ability to produce that output. That's not a 40% bottleneck; the frames per second from a GPU depend on the sheer amount of variation.

Imagine you're swinging a sledgehammer, driving a post into the ground. Every swing, you put forth 100% effort, but you might get 5 swings a minute. Get a bigger hammer and that drops to 4 swings a minute; you still put out 100% effort, but even that hammer is only 50% of what you can actually lift. Get a smaller hammer and you get 6 swings per minute: 100% effort, 45% of your ability.

A GPU paints that picture as fast as it can: 5 swings. It always does so at 100% of its clock/memory speeds, 100% effort, but that might only be using 50% of its ability. A bigger hammer just weighs more.

If a GPU is at 100% utilization, it's at the limit of how much it can lift, not how fast it swings. A GPU at 50% is still painting the picture as fast as possible, but there are so many details to paint that each frame physically takes longer, so frames per second drop. Time between swings.
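In other words, utilization measures how heavy each frame is, while fps is just the reciprocal of per-frame paint time. A trivial sketch of that relationship:

```python
def fps_from_frame_time(frame_time_ms: float) -> float:
    # fps is the reciprocal of how long the GPU takes to paint one frame
    return 1000.0 / frame_time_ms

# An easy frame painted in 5 ms vs a detail-heavy one taking 12.5 ms:
print(fps_from_frame_time(5.0))   # 200.0
print(fps_from_frame_time(12.5))  # 80.0
```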
 

daylightriot

I don't understand the reasoning for this test. You can get ANY CPU to bottleneck when you turn down the game settings and let the GPU run at 200+ fps. That's the way code works: you'll eventually find the loop that takes the longest to run no matter which processor, and the calculations take longer than the next frame time. With my Vega 64, I was pegged at 100% GPU usage and still getting 90 fps, which still looks amazing with FreeSync. For first-person shooters, that's generally where I like to target my frame rates; for single-player games I target 60 fps.

If you crank your game settings up to normal and the GPU is pegged at 100%, then you're not CPU bottlenecked. Who spends this much money on a PC to game at 720p?

You really need to buy a G-Sync or FreeSync monitor. Nvidia is now supporting FreeSync on some monitors. 90 fps is all you need for crystal-clear, smooth, quick gameplay with FreeSync and negligible input lag. My monitor is also 144 Hz, but I can't tell any difference once I get past 90 fps.

I have a FreeSync monitor. However, I don't think FreeSync support is coming to the 900 series.

The point of the test is to find a CPU bottleneck. According to my findings online, if you have zero to minimal CPU bottlenecking at 720p then you'll have none at 1080p. However, I don't know the validity of these claims.

I have learnt quite a bit about this particular issue talking to you lot and doing some online research. I also think this thread would be helpful for other Ryzen users.

Thank you all for your contributions, in particular gggplaya and Rogue Leader.
 

Rogue Leader

I have a FreeSync monitor. However, I don't think FreeSync support is coming to the 900 series.

The point of the test is to find a CPU bottleneck. According to my findings online, if you have zero to minimal CPU bottlenecking at 720p then you'll have none at 1080p. However, I don't know the validity of these claims.

I have learnt quite a bit about this particular issue talking to you lot and doing some online research. I also think this thread would be helpful for other Ryzen users.

Thank you all for your contributions, in particular gggplaya and Rogue Leader.

G-Sync is now supported on FreeSync monitors. Some work great, others not so much, but it's included in the latest driver package.

I think you get where we are going: the truth is everything is a bottleneck somewhere, unless you try to play something so far beyond the system's capabilities that everything is a bottleneck all the time (like trying to play Star Citizen on an old i3 and a GTX 750 Ti). The question is finding a balance between budget, desired framerate, and playability.
 
Yes. And I did explain.

It's the game code. It decides what is used or not. When all that gets shipped to the GPU, the GPU has to paint the frames on screen. It does so at 100% of its ability. But you have to figure there are really two things going on, not just one.

Imagine a screen that's all one color. The CPU ships its pre-rendered frames to the GPU; that might be 100 fps. For a single-color screen like that you'll get all 100 fps, as it's easy for the GPU to paint; it might take 20% of the GPU's ability to do so. Now take a complex set of frames with explosion detailing, which involves massive amounts of partial movement. You'll still get the 100 fps input from the CPU, but the GPU takes longer to paint each picture due to the huge pixel counts with different colors and motion, so fps drops because fewer frames are being output. Yet you might only see the GPU using 60% of its ability to produce that output. That's not a 40% bottleneck; the frames per second from a GPU depend on the sheer amount of variation.

Imagine you're swinging a sledgehammer, driving a post into the ground. Every swing, you put forth 100% effort, but you might get 5 swings a minute. Get a bigger hammer and that drops to 4 swings a minute; you still put out 100% effort, but even that hammer is only 50% of what you can actually lift. Get a smaller hammer and you get 6 swings per minute: 100% effort, 45% of your ability.

A GPU paints that picture as fast as it can: 5 swings. It always does so at 100% of its clock/memory speeds, 100% effort, but that might only be using 50% of its ability. A bigger hammer just weighs more.

If a GPU is at 100% utilization, it's at the limit of how much it can lift, not how fast it swings. A GPU at 50% is still painting the picture as fast as possible, but there are so many details to paint that each frame physically takes longer, so frames per second drop. Time between swings.


A major problem for the Ryzen series is the CCX modules and the Windows 10 scheduler. Windows 10 loves to move threads between cores, and Ryzen has performance issues when that happens. For example, if the cached data for a thread sits with a core in CCX module 1 and the thread is moved to a core in CCX module 2, the cached data has to be moved too, causing performance issues. Intel, on the other hand, has a big common cache which can be accessed by all cores at any time, and that's why it is not affected by this. Proof:
View: https://www.youtube.com/watch?v=BORHnYLLgyY
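One hedged workaround sometimes suggested for this (not something the video above necessarily endorses) is pinning the game to a single CCX with a CPU affinity mask. Below is a sketch of building a mask for Windows' `start /affinity` switch; the helper is hypothetical, and the assumption that logical CPUs 0-5 map to CCX0 on a 2600 with SMT on may not hold on every system:

```python
def affinity_mask(logical_cpus) -> int:
    """Build a bitmask for Windows' `start /affinity <hex>` from a
    list of logical CPU indices (bit n set = logical CPU n allowed)."""
    mask = 0
    for cpu in logical_cpus:
        mask |= 1 << cpu
    return mask

# Assumed mapping: CCX0 of a 2600 (3 cores, SMT on) = logical CPUs 0-5.
print(format(affinity_mask(range(6)), 'X'))  # 3F  -> start /affinity 3F game.exe
```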
 

InvalidError

As far as I'm aware, if you're not getting 100% GPU utilisation then something's holding it back. Am I wrong?
Unless you are experiencing significant performance issues, not reaching 100% GPU usage is perfectly fine: that's your headroom for the GPU to handle more demanding frames without losing much FPS. When the GPU is at 100% all of the time, you get jumpier, GPU-limited frame rates.
 

Rogue Leader

A major problem for the Ryzen series is the CCX modules and the Windows 10 scheduler. Windows 10 loves to move threads between cores, and Ryzen has performance issues when that happens. For example, if the cached data for a thread sits with a core in CCX module 1 and the thread is moved to a core in CCX module 2, the cached data has to be moved too, causing performance issues. Intel, on the other hand, has a big common cache which can be accessed by all cores at any time, and that's why it is not affected by this. Proof:

Right, and this is exactly why having at least 3000 MHz RAM on a Ryzen system is important. Anything slower exacerbates this issue.
 
I have a FreeSync monitor. However, I don't think FreeSync support is coming to the 900 series.

The point of the test is to find a CPU bottleneck. According to my findings online, if you have zero to minimal CPU bottlenecking at 720p then you'll have none at 1080p. However, I don't know the validity of these claims.

I have learnt quite a bit about this particular issue talking to you lot and doing some online research. I also think this thread would be helpful for other Ryzen users.

Thank you all for your contributions, in particular gggplaya and Rogue Leader.

I don't think dropping to 720p is a reasonable test. That's how reviewers on YouTube test CPUs: they drop down to 720p in order to eliminate the GPU as the bottleneck. The more valid test, when you're not trying to unbottleneck your GPU, is to just drop your CPU frequency by 400 MHz. If you see a corresponding drop in FPS, then you do have a CPU bottleneck. If there's no drop in fps, then you're GPU limited.
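That downclock test can be written down as a rule of thumb; the "corresponding drop" threshold here is an arbitrary assumption, not something stated in the thread:

```python
def cpu_limited_by_downclock(fps_stock: float, fps_downclocked: float,
                             clock_drop_pct: float = 10.0) -> bool:
    """Downclock the CPU (~400 MHz, roughly 10% on a 4.2 GHz part) and
    re-run the benchmark. If fps falls by a meaningful fraction of the
    clock drop, the CPU was the limiter; if fps is flat, the GPU was."""
    fps_drop_pct = 100.0 * (fps_stock - fps_downclocked) / fps_stock
    return fps_drop_pct >= clock_drop_pct / 2  # 'meaningful' = half the clock drop

print(cpu_limited_by_downclock(100, 92))  # True  (8% fps drop tracks the clock)
print(cpu_limited_by_downclock(100, 99))  # False (fps flat -> GPU-limited)
```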

It might be time to get a new GPU soon; 6GB of VRAM is not enough for the newest games using really nice HD textures. I really doubt Nvidia will give retroactive Adaptive Sync support to Maxwell GPUs, but the good news is you can pick up a used Vega 56 for dirt cheap these days. It does require undervolting, but that takes 10 seconds; I can walk you through it. I bought a used Vega 64 on eBay for <$300.

Or you can upgrade to a Pascal GTX 1080, which is going on eBay for about $340 and has 8GB of RAM. Nvidia will be a guess-and-check affair when it comes to FreeSync, unless your monitor is on the approved list. I'm sure there are users on Reddit or somewhere compiling a list of FreeSync-compatible monitors as well.
 

Rogue Leader

It might be time to get a new GPU soon; 6GB of VRAM is not enough for the newest games using really nice HD textures. I really doubt Nvidia will give retroactive Adaptive Sync support to Maxwell GPUs, but the good news is you can pick up a used Vega 56 for dirt cheap these days. It does require undervolting, but that takes 10 seconds; I can walk you through it. I bought a used Vega 64 on eBay for <$300.

Or you can upgrade to a Pascal GTX 1080, which is going on eBay for about $340 and has 8GB of RAM. Nvidia will be a guess-and-check affair when it comes to FreeSync, unless your monitor is on the approved list. I'm sure there are users on Reddit or somewhere compiling a list of FreeSync-compatible monitors as well.

Oh, it doesn't work on Maxwell? I did not know that; I thought it was across the board.

All FreeSync monitors now work with it; however, on some of them performance is very poor, while on others it works great. One of the other mods is using it on a monitor not on the list and says it's almost perfect.
 

daylightriot

I don't think dropping to 720p is a reasonable test. That's how reviewers on YouTube test CPUs: they drop down to 720p in order to eliminate the GPU as the bottleneck. The more valid test, when you're not trying to unbottleneck your GPU, is to just drop your CPU frequency by 400 MHz. If you see a corresponding drop in FPS, then you do have a CPU bottleneck. If there's no drop in fps, then you're GPU limited.

It might be time to get a new GPU soon; 6GB of VRAM is not enough for the newest games using really nice HD textures. I really doubt Nvidia will give retroactive Adaptive Sync support to Maxwell GPUs, but the good news is you can pick up a used Vega 56 for dirt cheap these days. It does require undervolting, but that takes 10 seconds; I can walk you through it. I bought a used Vega 64 on eBay for <$300.

Or you can upgrade to a Pascal GTX 1080, which is going on eBay for about $340 and has 8GB of RAM. Nvidia will be a guess-and-check affair when it comes to FreeSync, unless your monitor is on the approved list. I'm sure there are users on Reddit or somewhere compiling a list of FreeSync-compatible monitors as well.

Before the 980 Ti I had an RX 480 8GB, so I do have a bit of experience with undervolting :D I also have an Accelero 4 cooler lying around somewhere that I used on my old GPU. I sold it during the mining craze and used the money for a 980 Ti, a Papa John's, and a pair of jeans :D

Would a Vega 64 be a valid option or are they still way overpriced? I don't mind buying used, as I have a bit of knowledge of testing GPUs for stability.
 

Rogue Leader

Before the 980 Ti I had an RX 480 8GB, so I do have a bit of experience with undervolting :D I also have an Accelero 4 cooler lying around somewhere that I used on my old GPU. I sold it during the mining craze and used the money for a 980 Ti, a Papa John's, and a pair of jeans :D

Would a Vega 64 be a valid option or are they still way overpriced? I don't mind buying used, as I have a bit of knowledge of testing GPUs for stability.

If you have to pay near original price, no.

But a Vega 64 at $399-ish is a great deal; it beats the GTX 1080 in most games and keeps up with the RTX 2070. If it's $499 or more, get an RTX 2070.
 

daylightriot

Oh, it doesn't work on Maxwell? I did not know that; I thought it was across the board.

All FreeSync monitors now work with it; however, on some of them performance is very poor, while on others it works great. One of the other mods is using it on a monitor not on the list and says it's almost perfect.
If you have to pay near original price, no.

But a Vega 64 at $399-ish is a great deal; it beats the GTX 1080 in most games and keeps up with the RTX 2070. If it's $499 or more, get an RTX 2070.

Would that pricing be for a reference card? Or would I be able to get a better aftermarket card for similar pricing?

Also, what are your thoughts on the liquid-cooled version? Worth buying, or is it a bit of a gimmick?
 

Rogue Leader

Would that pricing be for a reference card? Or would I be able to get a better aftermarket card for similar pricing?

Also, what are your thoughts on the liquid-cooled version? Worth buying, or is it a bit of a gimmick?

I have the Vega 64 Liquid Cooled. It looks sweet in my system (click the link in my sig). That said, the pump is kind of loud, but even stock, mine runs sustainably faster than any other non-LC Vega 64, even aftermarket ones. I got it on launch day and paid $699 for it. I don't regret it, but remember I've owned it since Sept 2017.

At this point, though, my price point still stands: today I would not pay over $399, maybe $450 tops, for any Vega 64 (reference or aftermarket). Not when you can get an RTX 2070 for around $499 or less. The RTX 2070 consumes less power and performs the same or better in most titles.

This is a better deal (for example):

https://smile.amazon.com/EVGA-RTX-2...words=Vega+64&qid=1552494247&s=gateway&sr=8-2

Also keep in mind I don't know what PSU you have, but if you have anything less than a GOOD 750 W unit, you will regret buying a Vega 64.
 

daylightriot

I have the Vega 64 Liquid Cooled. It looks sweet in my system (click the link in my sig). That said, the pump is kind of loud, but even stock, mine runs sustainably faster than any other non-LC Vega 64, even aftermarket ones. I got it on launch day and paid $699 for it. I don't regret it, but remember I've owned it since Sept 2017.

At this point, though, my price point still stands: today I would not pay over $399, maybe $450 tops, for any Vega 64 (reference or aftermarket). Not when you can get an RTX 2070 for around $499 or less. The RTX 2070 consumes less power and performs the same or better in most titles.

This is a better deal (for example):

https://smile.amazon.com/EVGA-RTX-2...words=Vega+64&qid=1552494247&s=gateway&sr=8-2

Also keep in mind I don't know what PSU you have, but if you have anything less than a GOOD 750 W unit, you will regret buying a Vega 64.

I have a 650 W gold-rated PSU from Corsair. I also have a Kill A Watt plug meter for measuring total system power. I had a quick look and found a test over at AnandTech showing a total combined system wattage of 470 W in FurMark with a 7820X as the CPU. Bear in mind this is with an air-cooled Vega 64.

https://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/7

So worst case, I should be looking at the air versions? I'd much rather use an AMD GPU if I can, because I find the Adrenalin software much better than the dated Nvidia UI. Are there any older AMD cards that perform similarly to the Vega 64? I feel like a Vega 56 wouldn't be too much of an upgrade from my 980 Ti.
 

Rogue Leader

I have a 650 W gold-rated PSU from Corsair. I also have a Kill A Watt plug meter for measuring total system power. I had a quick look and found a test over at AnandTech showing a total combined system wattage of 470 W in FurMark with a 7820X as the CPU. Bear in mind this is with an air-cooled Vega 64.

https://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/7

So worst case, I should be looking at the air versions? I'd much rather use an AMD GPU if I can, because I find the Adrenalin software much better than the dated Nvidia UI. Are there any older AMD cards that perform similarly to the Vega 64? I feel like a Vega 56 wouldn't be too much of an upgrade from my 980 Ti.

I had a Seasonic Prime Titanium 650 W. Playing certain games I would hit OCP and the system would shut off, and that Corsair doesn't stand a chance against the Seasonic. I sold it and got the same PSU in 750 W. Mind you, I do have an 1800X, so my system can draw more power, but you're not that far off, especially if you upgrade your CPU. Also, in that test they didn't turn on HBCC, which makes a significant performance difference in many games. It also causes the card to use significantly more power.

The Vega 64 is faster than any other AMD GPU except for the new Radeon VII. The problem with the Radeon VII is it's $700-$800. If you have that kind of budget, sure, it's competitive with the RTX 2080 at a similar price. I agree the Vega 56 isn't a huge upgrade over the 980 Ti; you want more if you're going to spend that kind of money. Air-cooled cards do use less power because they can't clock as high as the LC one, and you can avoid the high-performance profile to keep power use down, but if you're doing that it's kind of a waste to spend $400-$500 on a GPU you're limiting.
 

daylightriot

I had a Seasonic Prime Titanium 650 W. Playing certain games I would hit OCP and the system would shut off, and that Corsair doesn't stand a chance against the Seasonic. I sold it and got the same PSU in 750 W. Mind you, I do have an 1800X, so my system can draw more power, but you're not that far off, especially if you upgrade your CPU. Also, in that test they didn't turn on HBCC, which makes a significant performance difference in many games. It also causes the card to use significantly more power.

The Vega 64 is faster than any other AMD GPU except for the new Radeon VII. The problem with the Radeon VII is it's $700-$800. If you have that kind of budget, sure, it's competitive with the RTX 2080 at a similar price. I agree the Vega 56 isn't a huge upgrade over the 980 Ti; you want more if you're going to spend that kind of money. Air-cooled cards do use less power because they can't clock as high as the LC one, and you can avoid the high-performance profile to keep power use down, but if you're doing that it's kind of a waste to spend $400-$500 on a GPU you're limiting.

Agreed, I'd want every bit of horsepower that bad mamajama can use. What is HBCC, some sort of performance setting? Also, do you have other high-wattage parts in your system increasing the power draw? I know the question is a bit redundant, but I'd like to avoid upgrading the PSU if possible.

I have 2 x 120GB SATA SSDs, 1 x 2TB 7200rpm HDD, 1 x 1TB 7200rpm HDD, and 1 x PCIe wifi card. Also, my max power draw with my current setup is 384 W whilst gaming (CPU OC'd to 4.2 GHz with SMT on, GPU at its stock frequency of 1.39 GHz).

Could I get away with my current PSU?
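A back-of-the-envelope headroom check with the numbers quoted in this thread (all of which are estimates, and FurMark is a worst-case load):

```python
def headroom_pct(psu_watts: float, load_watts: float) -> float:
    # Remaining PSU capacity as a percentage of its rating
    return 100.0 * (psu_watts - load_watts) / psu_watts

# 650 W PSU vs the 384 W draw measured today, and vs AnandTech's ~470 W
# full-system FurMark figure for an air-cooled Vega 64 build:
print(round(headroom_pct(650, 384), 1))  # 40.9
print(round(headroom_pct(650, 470), 1))  # 27.7
```

The sketch ignores transient spikes, which is exactly what trips OCP on a tightly sized unit, so the percentages overstate the real margin.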
 

Rogue Leader

Agreed, I'd want every bit of horsepower that bad mamajama can use. What is HBCC, some sort of performance setting? Also, do you have other high-wattage parts in your system increasing the power draw? I know the question is a bit redundant, but I'd like to avoid upgrading the PSU if possible.

I have 2 x 120GB SATA SSDs, 1 x 2TB 7200rpm HDD, 1 x 1TB 7200rpm HDD, and 1 x PCIe wifi card. Also, my max power draw with my current setup is 384 W whilst gaming (CPU OC'd to 4.2 GHz with SMT on, GPU at its stock frequency of 1.39 GHz).

Could I get away with my current PSU?

This explains HBCC better than I can:


I have fewer drives than you, and I don't have anything else pulling a lot more power: 1 M.2 PCIe SSD, 1 SATA SSD, and 1 7200rpm HDD. I do have a lot of fans, but they don't draw much. The GPU alone averages 330 W while gaming and spikes to 380 W with HBCC on.

Could you get away with it? Maybe, but if you start getting crashes, as in the system straight up shutting off, you'll know what the problem is. I anticipate you will have problems.
 
