AMD's Future Chips & SoC's: News, Info & Rumours.

Page 103

jdwii

Splendid
To be frank, it's already happening: 4C/4T is not enough to maximize frame rates with a modern $200+ GPU.

It's not that it won't work, but you will not be getting the most out of the game FPS-wise, and if you have a beefy GPU you will not be seeing it at 100% usage in many games.

In 2014 I could easily recommend a 4-core processor to any gamer at any budget, but not anymore. Think about it: in 2019 an 8-core 9900K is the best option for gamers. Even a 9700K loses in some games when overclocked to 5GHz, and that is an 8C/8T part.

I'll say it again: 6 cores will not be the ultimate gaming CPU for long, if it even is anymore in today's day and age. With next-gen consoles using 8 cores and 16 threads, I think we will see quicker adoption of 8 cores than we did going from 4 to 6 cores.


https://postimg.cc/hfgqyz7F

Looking at the Steam survey, 4-core share is down 1.21%, while 6-core share is up 0.36% and 8-core share is up 0.25%. The jump to 8 cores on Steam is already almost as big as the jump to 6.
 

InvalidError

Titan
Moderator
To be frank, it's already happening: 4C/4T is not enough to maximize frame rates with a modern $200+ GPU.

It's not that it won't work, but you will not be getting the most out of the game FPS-wise, and if you have a beefy GPU you will not be seeing it at 100% usage in many games.
100% GPU usage does not matter; what does matter is maintaining playable frame rates. GPU load varies massively depending on what is on screen, and if you are at 100% GPU load at not much beyond 60fps in a relatively lightweight scene, you will encounter jarring frame rate dips in heavier scenes. Also, if you play with vsync enabled, 100% GPU load only happens when the frame rate drops below vsync, unless you enable triple-buffering, at which point you are pissing away power rendering frames that never get to screen.

Yes, there is adaptive sync these days but how many average people care enough about it to ditch their existing monitors and TVs that typically have a useful life of 10+ years? My monitors are 8-15 years old and still working great, no reason to replace any of them yet. My newest addition is a $600 Samsung 7100 TV and I'm not planning to replace it until OLED or uLED reaches price parity with LCDs.
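To put rough numbers on the vsync point, here is a toy sketch with made-up frame times (not measurements from any particular game) showing why a scene that is only slightly heavier than the refresh budget produces a jarring dip:

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000 / REFRESH_HZ  # ~16.7 ms budget per refresh

def vsync_fps(frame_time_ms):
    # With vsync and no triple buffering, a frame that misses the refresh
    # deadline is held until the next refresh, so frames are delivered
    # every ceil(frame_time / refresh_interval) refreshes.
    intervals = math.ceil(frame_time_ms / REFRESH_MS)
    return REFRESH_HZ / intervals

# made-up GPU frame times for a light and a heavy scene
for scene, ms in [("light scene", 15.0), ("heavy scene", 18.0)]:
    print(f"{scene}: {ms:.1f} ms -> {vsync_fps(ms):.0f} fps with vsync")
# light scene: 15.0 ms -> 60 fps with vsync
# heavy scene: 18.0 ms -> 30 fps with vsync
```

A scene only a couple of milliseconds over the 16.7 ms budget halves the delivered frame rate, which is the kind of dip described above.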
 
It's not a matter of games alienating their user base with increased requirements.

They can perfectly well make games scale using better engines. Problem is, the more features you want in your games, the more processing power you need anyway. You can't make the train wait eternally for people to get on. It has to depart at some point!

That's just how it is.

Cheers!
 

InvalidError

Titan
Moderator
You can't make the train wait eternally for people to get on. It has to depart at some point!
If not enough people are interested in taking a particular train route, the whole route eventually gets cancelled and the rail decommissioned. Around 20 train companies have gone bankrupt in the USA over the past century.

If game developers push their luck too much, they disproportionately increase their support overheads for managing customer complaints, lose sales due to bad customer reviews, lose some more future sales from customers deciding not to do any more business with the company, lose future sales due to souring a previously successful franchise and when that gets bad enough, future games get cancelled. (Though a lot of franchise spoiling is already happening simply from developers/publishers screwing up franchises with abusive microtransactions, putting things that should be in the base game into DLCs instead, shipping games that are barely viable, etc.)
 

goldstone77

Distinguished
Even when I bought my 2500K brand new with a ~$200 card ~7 years ago, I wasn't able to play games flawlessly at 1080p on the Ultra quality preset. Dialing the settings down improved gameplay. Nothing has changed today. People using lesser hardware must dial down the settings to make some games playable. If you want the best experience, you have to buy the right hardware for the game you want to play, at the resolution and FPS you want to play it at. People with the best hardware will be able to play games at the highest resolutions with the best quality settings. People with lesser hardware will still be able to play these games, but will have to turn down the settings/resolution. Lots of people play esports on cheap laptops with the settings and resolution dialed all the way down.
 
Main issue with quads is the lows. The averages are OK, but stutter can make a game less fun.

For example, I get 80+ fps with my RX 580 8GB and R3 1200 in GTA V; however, when explosions hit the screen or you are driving really fast, you will feel some stutters. I could also blame some of this stutter on my 8GB of RAM, but overclocking the memory and CPU helps a lot. I guess this 580 would have been over $200 at one point?
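To illustrate why the averages can look fine while the game still stutters, here is a minimal sketch of how average fps vs. 1% lows could be computed from a frame-time log (the frame times are invented for illustration):

```python
# made-up frame-time log: mostly smooth frames with a few long hitches
frame_times_ms = [12.5] * 990 + [45.0] * 10

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))

# "1% lows": average fps over the slowest 1% of frames
worst_count = max(1, len(frame_times_ms) // 100)
worst = sorted(frame_times_ms, reverse=True)[:worst_count]
low_1pct_fps = 1000 / (sum(worst) / len(worst))

print(f"average: {avg_fps:.0f} fps, 1% lows: {low_1pct_fps:.0f} fps")
# average: 78 fps, 1% lows: 22 fps
```

The average stays near 78 fps, but the slowest 1% of frames land around 22 fps, and those are the hitches you feel during explosions or fast driving.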
 

boju

Titan
Ambassador
Background services can also be dialed back, freeing up more resources so the CPU can do more in the games you play.

If there's a need for a band-aid fix, there are programs that can help, like Winaero.

8GB of RAM was considered more than enough once. With the amount of I/O activity big games generate on drives today, even with 16GB, the recommended-RAM bubble is shifting toward 32GB reasonably quickly.
 
If not enough people are interested in taking a particular train route, the whole route eventually gets cancelled and the rail decommissioned. Around 20 train companies have gone bankrupt in the USA over the past century.

If game developers push their luck too much, they disproportionately increase their support overheads for managing customer complaints, lose sales due to bad customer reviews, lose some more future sales from customers deciding not to do any more business with the company, lose future sales due to souring a previously successful franchise and when that gets bad enough, future games get cancelled. (Though a lot of franchise spoiling is already happening simply from developers/publishers screwing up franchises with abusive microtransactions, putting things that should be in the base game into DLCs instead, shipping games that are barely viable, etc.)
False equivalency, though. So you mean that the PC (as a standard) will just cease to exist because people won't be buying new ones? They'll keep buying the bottom of the barrel so no new CPUs are developed?

I don't even want to make a parallel with Freight Trains in the mix, now :p

They need to strike a balance, but it is in everyone's best interest that they keep pushing technology. Intel was happy with the status quo; look what happened in 5+ years with PCs.

Cheers!
 

JaSoN_cRuZe

Honorable
The transition to newer hardware would have been smoother if Intel had provided more cores to the consumer market sooner. After the launch of Ryzen we are seeing tremendous value in these processors, which made Intel up its game; otherwise we would still be at 6C/6T max, and with Ryzen out of the picture we would be recommending the 9600K to all gamers, the same way the 3570K used to be recommended over the 3770K for gaming.
 

InvalidError

Titan
Moderator
False equivalency, though. So you mean that the PC (as a standard) will just cease to exist because people won't be buying new ones? They'll keep buying the bottom of the barrel so no new CPUs are developed?
No, they'll keep buying the slowly rising bottom of the barrel every 5-10 years. That's why most games' minimum requirements are hardware from 5+ years ago.

At the end of the day though, the main reason most games don't have stupidly high CPU requirements is simply that most games don't need it and software developers save themselves tons of headaches by not having any more threads than absolutely necessary beyond automatic threading from libraries and SDKs.
 
No, they'll keep buying the slowly rising bottom of the barrel every 5-10 years. That's why most games' minimum requirements are hardware from 5+ years ago.

At the end of the day though, the main reason most games don't have stupidly high CPU requirements is simply that most games don't need it and software developers save themselves tons of headaches by not having any more threads than absolutely necessary beyond automatic threading from libraries and SDKs.
Interesting the comment about SDKs...

We now have DX12, consoles with 8 cores (well, 6 and a half, give or take), entry-level (bottom of the barrel) CPUs with 4C/4T configs, and most SDKs after 2014 are really well threaded compared to what was around in 2009-2010, or even 2012. Specialized libraries have been as wide-threaded as they could be from day 1.

Developers have no excuse not to use the extra resources found in modern PCs now, other than making simple games that don't really require them. The big majority of them, especially AAA ones, need to showcase the improvements or they will most likely fall flat on expectations. At least, that's my take.

Plus, I'm damn sure modern game engines like UE4 and Unity are up to snuff with threading. They've said as much, right?

I don't know what else most people need to realize they have to upgrade from the 2C/4T bottom-of-the-barrel CPUs of 2010 if they intend on playing games. They are delusional if they think those CPUs will be enough for really modern games. And, like you said, game devs would be delusional to think 100% of their user base will upgrade for them. A balance must be struck, and that's where Intel, AMD, nVidia and the OEMs come into play.

Cheers!
 

InvalidError

Titan
Moderator
Interesting the comment about SDKs...

We now have DX12, consoles with 8 cores (well, 6 and a half, give or take), entry-level (bottom of the barrel) CPUs with 4C/4T configs, and most SDKs after 2014 are really well threaded compared to what was around in 2009-2010, or even 2012. Specialized libraries have been as wide-threaded as they could be from day 1.
Having all the multi-threaded libraries in the world does you little good if they account for only 20% of total CPU time, not because you aren't using them but because they represent only that much of the application's total processing workload.
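That is essentially Amdahl's law: if the threaded libraries cover only a small share of CPU time, extra cores barely move the needle. A quick sketch, assuming a hypothetical 20% parallel share:

```python
# Amdahl's-law style estimate: if threaded libraries cover only 20% of
# CPU time, extra cores barely move total run time (numbers illustrative)
def speedup(parallel_fraction, cores):
    serial = 1 - parallel_fraction
    return 1 / (serial + parallel_fraction / cores)

for cores in (2, 4, 8, 16):
    print(f"{cores:2d} cores: {speedup(0.20, cores):.2f}x overall")
#  2 cores: 1.11x overall
#  4 cores: 1.18x overall
#  8 cores: 1.21x overall
# 16 cores: 1.23x overall
```

Even at 16 cores the whole application only gets about 1.23x faster, because the remaining 80% of the work is still serial.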
 

jdwii

Splendid
100% GPU usage does not matter; what does matter is maintaining playable frame rates. GPU load varies massively depending on what is on screen, and if you are at 100% GPU load at not much beyond 60fps in a relatively lightweight scene, you will encounter jarring frame rate dips in heavier scenes. Also, if you play with vsync enabled, 100% GPU load only happens when the frame rate drops below vsync, unless you enable triple-buffering, at which point you are pissing away power rendering frames that never get to screen.

Yes, there is adaptive sync these days but how many average people care enough about it to ditch their existing monitors and TVs that typically have a useful life of 10+ years? My monitors are 8-15 years old and still working great, no reason to replace any of them yet. My newest addition is a $600 Samsung 7100 TV and I'm not planning to replace it until OLED or uLED reaches price parity with LCDs.

I agree with the "average gamer" comment and the point about people using older monitors, though. Personally I love G-Sync and FreeSync, and I do see a lot of enthusiasts upgrading to higher-end monitors. At the end of the day the monitor is everything you are looking at; input lag and overall visuals start from there. Then again, I guess I might not be the average gamer lol.
 

jdwii

Splendid
Main issue with quads is the lows. The averages are OK, but stutter can make a game less fun.

For example, I get 80+ fps with my RX 580 8GB and R3 1200 in GTA V; however, when explosions hit the screen or you are driving really fast, you will feel some stutters. I could also blame some of this stutter on my 8GB of RAM, but overclocking the memory and CPU helps a lot. I guess this 580 would have been over $200 at one point?

You would easily notice an upgrade to Zen 2, a massive one indeed. A Ryzen 1200 can even bottleneck a 1050 Ti in some situations. My friend had an RX 480 with his 8350, and I finally talked him into moving to Zen; he is extremely happy with his improved performance in games. He plays games even more than me and is into CS:GO and all that.


Edit: as a side note, I finally ordered my 3700X now that they are in stock at Newegg, after 11 days of waiting lol.

If I had to rank today's CPU deals for gaming and, heck, productivity too, I would rank them:
Ryzen 1600 for $100 (heck of a deal!)
Ryzen 2600
Ryzen 3600 for $200
 

jdwii

Splendid
Does AMD still make 1st-gen CPUs? The 1700 is about the price of the 3700X now, even though the 1700 used to be the same price as a 2600.

That's a good question; I wouldn't even know where to look for that answer. It seems like you can easily find them anywhere. Also, a 1600 + 580 would be a nice combo. Man, listen to me, I sound like an AMD salesman lol, trying to get everyone here to upgrade their parts. :)

Nah, it's just that we haven't had it this good in years (since Hammer?) in terms of CPU performance/competition, and it's sad to see users ignore this time. I would never want people to waste money on something they wouldn't notice as an improvement.

I'm confident that in gaming, and in some other cases, I will see a difference going from my 2700X to a 3700X. I use a lot of simulations and emulators, so IPC + frequency is very important to me. I also own a 1440p 144Hz monitor, so higher frame rates matter a lot to me.

Even The Sims 3, yes, lol. I know I'm a fan of The Sims (except 4), but even Sims 3 uses 2 cores on my 2700X at 90%+. Age of Empires 3 does the same thing, so I'm 100% certain I'll see a difference.
 
