[SOLVED] GTX 1650 laptop: terrible FPS in AC games

Mar 3, 2020
12
1
15
So I bought an Asus FX705DT laptop with a GTX 1650 4 GB card and a Ryzen 5 3550H CPU. In every AAA game so far the FPS is what it should be, including The Witcher 3, where I get 60 FPS on High at 1080p; Far Cry 5 and Red Dead Redemption 2 are pretty fine too, and so on. The FPS is pretty decent, and lowering or raising the graphics settings/resolution does affect it. However, that's not the case when I try to play AC Unity or AC Origins: both games run terribly with massive FPS drops, even below 30 FPS, and sometimes I don't even get 60 FPS in the menus, lol. The graphics settings/resolution don't improve a single frame in these two games! I can play at 720p with literally everything on low and I get the exact same FPS as on High at 1080p. I have checked discussions suggesting certain changes in the Nvidia Control Panel, Nvidia Inspector, etc., and I have done all of those: everything on max performance, including my power settings. They have zero impact on the FPS in these cases, no improvement at all. I am playing while plugged in, of course. I assume something is surely wrong here, as I have seen a review from a guy who bought this laptop and gets 60 FPS in Unity with High settings at 1080p, for instance, so it probably can't be a CPU bottleneck, right? I am clueless about what the problem could be at this point. Thanks for the help in advance.
 
 
AC FPS drops would be related to the CPU.
Minimum is 4 cores/4 threads @ 3.1 GHz (desktop 65 W CPU) for 720p @ 30 FPS.
Recommended is 4 cores/8 threads @ ~3.4 GHz (desktop 65 W CPU) for 1080p @ 30 FPS.
Your CPU is a cut-down 35 W part, so I'd suggest limiting the in-game FPS to 30 so it won't have massive drops.
I have limited it to 35 and to 30 as well, but unfortunately in certain cases it goes down as low as 25 in big crowds, which is just unacceptable. And as I checked benchmarks, this laptop does 35 FPS on High and 48 on Low in Odyssey, for instance. However, Unity and Origins don't respond to graphics settings either; no matter how much I lower them, the FPS stays the same.
 

WildCard999

Titan
Moderator
AC: Unity was, and I'm pretty sure still is, poorly optimized.

As for the FPS dips I'd check a few things...

-You say it's plugged in, but what power settings are you using?
-What speed is the memory running at in the BIOS? AMD Ryzen performs better with fast memory. I think the Ryzen laptops usually ship with 2666, but if DOCP (ASUS's version of XMP) isn't enabled it will run at 2133, which will hurt gaming performance and cause FPS dips.
-Monitor temps. While laptop parts can run a bit hotter than their desktop counterparts, they could still be getting hot enough for thermal throttling to kick in.
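As a rough illustration of why that memory speed matters, here's a back-of-the-envelope peak-bandwidth calculation (a sketch, not a benchmark; the helper name is my own):

```python
# Rough peak-bandwidth estimate for DDR4:
# transfers per second * 8 bytes per 64-bit channel * number of channels.
def ddr4_bandwidth_gbs(mt_per_s, channels=2):
    """Peak theoretical bandwidth in GB/s (decimal) for DDR4 memory."""
    return mt_per_s * 1e6 * 8 * channels / 1e9

for speed in (2133, 2666, 3200):
    print(f"DDR4-{speed}: ~{ddr4_bandwidth_gbs(speed):.1f} GB/s peak, dual channel")
```

Note that a single stick runs single-channel, which halves these figures; that's part of why adding a second matched stick tends to help Ryzen laptops.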

Since lowering the graphics settings/resolution had little impact on the FPS, the CPU is the culprit. The usual fix is to overclock, but that's generally not recommended on a laptop since thermals are normally already quite high.


One last thing to maybe look into is having Nvidia GeForce Experience "optimize" the game. I normally don't recommend this, especially for those with higher-refresh monitors, since NGE doesn't tweak games well, but it may be worth considering since it's a 1080p/60 Hz display. If you've done this, are you still experiencing the dips?
 
Solution
AC: Unity was, and I'm pretty sure still is, poorly optimized.

As for the FPS dips I'd check a few things...

-You say it's plugged in, but what power settings are you using?
-What speed is the memory running at in the BIOS? AMD Ryzen performs better with fast memory. I think the Ryzen laptops usually ship with 2666, but if DOCP (ASUS's version of XMP) isn't enabled it will run at 2133, which will hurt gaming performance and cause FPS dips.
-Monitor temps. While laptop parts can run a bit hotter than their desktop counterparts, they could still be getting hot enough for thermal throttling to kick in.

Since lowering the graphics settings/resolution had little impact on the FPS, the CPU is the culprit. The usual fix is to overclock, but that's generally not recommended on a laptop since thermals are normally already quite high.


One last thing to maybe look into is having Nvidia GeForce Experience "optimize" the game. I normally don't recommend this, especially for those with higher-refresh monitors, since NGE doesn't tweak games well, but it may be worth considering since it's a 1080p/60 Hz display. If you've done this, are you still experiencing the dips?

Thanks for the reply! I have just checked the memory speed and yes, it is "2667". When plugged in I even tried the High Performance and Ultra Performance power options, with processor rates at 100%, and I get the exact same performance as on "Balanced". I was checking the temps: in these two AC games the CPU runs at around 78-85°C most of the time and the GPU at around 70°C, but those CPU temps haven't affected any other game yet in terms of performance. Yes, I have tried the GeForce optimization. In Origins' case it gave a really minimal improvement when I launched the game through GeForce Experience after optimizing, but really just a little. I tried it for Unity too, but that remains just as bad as it was without. :/
I just don't understand how benchmarks all over the internet get relatively great results in these games on the base variant of this laptop while I get so much worse performance. I mean, even my old PC with an i5 4460 and a GTX 960 2GB performed better than this in Unity and Origins, which is kind of nonsense.
 

WildCard999

Titan
Moderator
Are you running other software? I'd close down pretty much everything that doesn't have to do with playing Unity and see if that helps.

When I got my laptop recently I had to dump some of MSI's bloatware and Norton AV as well to get the most from the system.

If those games were recently downloaded it probably wouldn't hurt to verify the game files. I recently got back into BF4 and it ran terribly; re-verifying the files (2 were missing) made a massive difference, almost 25 FPS.
 
Are you running other software? I'd close down pretty much everything that doesn't have to do with playing Unity and see if that helps.

When I got my laptop recently I had to dump some of MSI's bloatware and Norton AV as well to get the most from the system.

If those games were recently downloaded it probably wouldn't hurt to verify the game files. I recently got back into BF4 and it ran terribly; re-verifying the files (2 were missing) made a massive difference, almost 25 FPS.

No other software running. I disabled my AVG to see if there was any change, but no change at all. I did verify the files; no improvement. :(
 

WildCard999

Titan
Moderator
Since you're saying Witcher 3/RDR2 are running well, it's probably poor optimization on Ubisoft's end. Shocking! (sarc) I'll try it on my laptop when I get home in a few hours and let you know how it runs. The system is pretty close, with the same memory/GPU, but I do have a better CPU, the Intel i5-9300H. At least if my FPS aren't much higher, it kind of reinforces the idea of bad optimization.

The only upgrade that "could" make a difference in performance would be adding a 2x8 GB kit at the fastest speed the motherboard supports.
 
Since you're saying Witcher 3/RDR2 are running well, it's probably poor optimization on Ubisoft's end. Shocking! (sarc) I'll try it on my laptop when I get home in a few hours and let you know how it runs. The system is pretty close, with the same memory/GPU, but I do have a better CPU, the Intel i5-9300H. At least if my FPS aren't much higher, it kind of reinforces the idea of bad optimization.

The only upgrade that "could" make a difference in performance would be adding a 2x8 GB kit at the fastest speed the motherboard supports.

Yeah, Witcher 3 runs the way it should, just like the benchmarks showed. Red Dead 2 with ultra textures and the rest on medium runs as it should too; I locked it at 40 FPS, but it didn't go below 30. FPS games like COD WW2 and BF1 run the way they should at 50-60 FPS. I tried Black Flag, which I know is old, but it ran great too. The AC3 remaster did have FPS drops though: on ultra it went below 40 at times, which is ridiculous and points to Ubisoft's shoddy optimization for sure; apparently all they did was change the lighting from the 2012 version. I haven't tried Odyssey yet; I will download it tomorrow and see if the same issue persists. According to benchmarks it definitely shouldn't...
Thank you for the check, I am curious what the result will be.

My FPS in Unity is 30-40 on the streets most of the time and 45-50 on rooftops, regardless of the graphics settings. In big palaces with detailed interiors and crowds it can drop to 28-29, but I have seen 25 as well, like I said.
Yeah, I was thinking about getting another 8 GB of RAM; allegedly it helps these Ryzen mobile processors massively, so that might help.
 

WildCard999

Titan
Moderator
AC: Odyssey is a tough game to run. I run it at 60-75 FPS on a Ryzen 2600 (PBO activated), 2x8 GB @ 3200 & a 1660S, on Medium at 2560x1080, with some rare dips (heavy city combat) to the mid 50s. Fortunately the monitor is FreeSync and works with my Nvidia GPU, so the dips are not noticeable. I imagine getting it smooth on the laptop may be a bit tricky, but I may be able to try that one out as well.

If you get more memory you'll want to replace the current stick as well: ideally the new kit is a faster speed, and the current stick would drag a faster additional stick down to its own speed.
 
AC: Odyssey is a tough game to run. I run it at 60-75 FPS on a Ryzen 2600 (PBO activated), 2x8 GB @ 3200 & a 1660S, on Medium at 2560x1080, with some rare dips (heavy city combat) to the mid 50s. Fortunately the monitor is FreeSync and works with my Nvidia GPU, so the dips are not noticeable. I imagine getting it smooth on the laptop may be a bit tricky, but I may be able to try that one out as well.

If you get more memory you'll want to replace the current stick as well: ideally the new kit is a faster speed, and the current stick would drag a faster additional stick down to its own speed.

I can't believe what has just happened. I decided to try one last thing, the simplest, tiniest thing, thinking it wouldn't change anything at all: I disabled my WiFi. I had paused Unity at a certain checkpoint in a mission and ran the same circles during each check to compare the FPS. With WiFi disabled I am clearly getting 10-12 more FPS in the crowd: the lowest is now 35 where it was 26-27, and where it used to be 35 it's 45-47 now (High settings, 1080p), lol.
 

WildCard999

Titan
Moderator
That actually does make some sense, as those co-op points are all through the map, and I think at some point during the campaign you can see shadow outlines of other players, so without the WiFi that's less taxing on the build. Since I think the co-op is a Unity-only thing, that may be the only game where you see the improvement when disabling it, but it may be worth testing on the others too.
 
You might be right about that, yeah, but it seems like quite a drastic improvement. Tomorrow I will definitely check both Origins and Odyssey with WiFi disabled as well.
 

WildCard999

Titan
Moderator
To think Ubisoft's optimization is so bad you have to disable WiFi for single-player games... :ROFLMAO:

(attached image: 65AyK5l.png)
 

WildCard999

Titan
Moderator
Yeah, when changing the graphics has no effect on the FPS it's the CPU, although I think upgrading the memory could make a significant difference. I did testing on my desktop comparing 2133 and 3200, and the difference in gaming performance was almost 30 FPS, going from frequent and drastic FPS dips to few or none, and mild ones at that.

At least with this improvement you're more in playable territory instead of an unplayable, stuttery mess.
 

WildCard999

Titan
Moderator
So I played Unity for a little bit and it actually ran pretty well. I used NGE to optimize the game, which set Environment/Texture to Ultra High and Shadow to Soft Shadows (PCSS). Resolution was 1080p. In the city the FPS bounced around a bit but stayed mostly at 50, with a max of 60 and dips to 40 at the lowest. Disabling WiFi didn't increase the FPS for me.

With the performance difference between the 9300H & 3550H this seems about right as open world games are CPU intensive.
https://cpu.userbenchmark.com/Compare/Intel-Core-i5-9300H-vs-AMD-Ryzen-5-3550H/m744904vsm718601

That being said if you do end up doing the upgrade for the memory it could possibly help. Will report back in a bit on Odyssey which may be better to test on since it has the in game benchmark.
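For anyone comparing runs like this, min/avg/max alone can hide stutter; the "1% low" figure from a frame-time log captures the dips better. A small sketch of how that's computed (function name and input format are my own, assuming frame times in milliseconds from an overlay's log):

```python
def fps_stats(frame_times_ms):
    """Summarize a frame-time log: average FPS plus the '1% low' FPS,
    i.e. the FPS implied by the slowest 1% of frames."""
    times = sorted(frame_times_ms)
    avg_fps = 1000 * len(times) / sum(times)
    worst = times[int(len(times) * 0.99):]  # slowest 1% of frames
    one_pct_low = 1000 * len(worst) / sum(worst)
    return round(avg_fps, 1), round(one_pct_low, 1)

# 99 smooth frames at 20 ms (50 FPS) plus one 40 ms hitch:
print(fps_stats([20.0] * 99 + [40.0]))  # -> (49.5, 25.0)
```

A single hitch barely moves the average but tanks the 1% low, which matches how the crowd dips in Unity feel worse than the average FPS suggests.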
 
Odyssey, set with NGE (Mostly Low settings with 1-2 @medium). Min: 32, Avg: 45, Max: 60. No difference with wifi on or off.

So far I have only tested Origins today, and yes, with WiFi disabled I got more FPS in Alexandria: 30-36 on the main street instead of 25-31 (I tested in the same spot, same time of day/lighting), and outside big towns about 8-9 FPS more with WiFi disabled, lol (medium settings). This is still all so weird though; lowering the settings doesn't improve a single frame in these games.
 

WildCard999

Titan
Moderator
So far I have only tested Origins today, and yes, with WiFi disabled I got more FPS in Alexandria: 30-36 on the main street instead of 25-31 (I tested in the same spot, same time of day/lighting), and outside big towns about 8-9 FPS more with WiFi disabled, lol (medium settings). This is still all so weird though; lowering the settings doesn't improve a single frame in these games.

It's the CPU.

"Now, here’s what you’ve all been waiting for…
  • If lowering the graphics settings has no effect on frame rates, then the bottleneck is your CPU
  • If lowering the graphics settings increases the frame rate, then your GPU is reaching its upper limits
That’s pretty straight-forward, isn’t it?"

Source:
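The quoted rule of thumb can be written as a tiny diagnostic helper (purely illustrative; the function name and tolerance are my own):

```python
def diagnose_bottleneck(fps_high_settings, fps_low_settings, tolerance=0.05):
    """Apply the rule of thumb quoted above: if dropping the settings barely
    moves the frame rate, the CPU is the limit; if it rises, the GPU was."""
    if fps_low_settings <= fps_high_settings * (1 + tolerance):
        return "CPU-bound"
    return "GPU-bound"

print(diagnose_bottleneck(30, 31))  # settings change did nothing -> CPU-bound
print(diagnose_bottleneck(35, 48))  # Odyssey-style scaling       -> GPU-bound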
 
It's the CPU.

"Now, here’s what you’ve all been waiting for…
  • If lowering the graphics settings has no effect on frame rates, then the bottleneck is your CPU
  • If lowering the graphics settings increases the frame rate, then your GPU is reaching its upper limits
That’s pretty straight-forward, isn’t it?"

Source:

Yeah, that's probably the sad truth. :( But I don't know why other users haven't complained about this, and why their settings changes did have a decent impact on performance. I even checked on a site whether this CPU is an acceptable match for this GPU, and allegedly it should pair properly without a bottleneck. Regardless, I am planning to get that 2x8 GB RAM kit...
 

WildCard999

Titan
Moderator
If you're referring to YT vids, I wouldn't exactly trust them. While it makes no sense to me why someone would lie about performance, it happens more often than not.

Even sites such as NotebookCheck are fine to use; however, when checking game benchmarks for a certain mobile GPU, you need to also check which CPU was paired with that GPU.
 