News Cyberpunk 2077 Bug Bulldozes AMD Ryzen: Here's a Fix, Tested

Just want to share this, and please take it with a grain of salt, because I did too. I applied the hex edit (the one sometimes called the SMT fix) and the CPU/GPU memory pool change to Cyberpunk's files. I was very skeptical, but I'd seen enough positive feedback and it was a simple thing to do, so I tried it. Before the change I was getting 70 fps in a crowded part of the city; afterwards I got 85 fps. Thinking I might have measured it wrong, I went to the most crowded part of the city I could find and still saw a 15-20% boost in fps. As I understand it, the default values were set with the PlayStation and Xbox in mind rather than the PC. The adjustment is done in Notepad: change the pool parameters to match your own machine. With 16 GB of RAM I divided by two, so 8 GB for the CPU pool and 8 GB for the GPU pool to match my card's VRAM, versus the much lower console-oriented defaults. I'd post a link, but I don't think it would be allowed; a quick Google search will turn it up easily. I'm not promoting this, just sharing my experience. :)
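For anyone curious what that pool edit looks like in practice, here's a minimal Python sketch. The file path, the file name (memory_pool_budgets.csv) and the default values are assumptions based on what was circulating in community threads at the time, so double-check them against your own install and keep a backup:

```python
# Hypothetical sketch only -- file location, name, and default values are
# assumptions from community posts, not verified against the game files.
from pathlib import Path

CONFIG = Path(r"C:\Games\Cyberpunk 2077\engine\config\memory_pool_budgets.csv")

# Example targets for a PC with 16 GB RAM and an 8 GB graphics card:
# half of system RAM for the CPU pool, the card's VRAM for the GPU pool.
replacements = {
    "1536MB": "8GB",  # assumed default PoolCPU budget
    "3GB": "8GB",     # assumed default PoolGPU budget
}

text = CONFIG.read_text(encoding="utf-8")
CONFIG.with_name(CONFIG.name + ".bak").write_text(text, encoding="utf-8")  # keep a backup

# Naive string replacement -- open the file afterwards and sanity-check
# that only the PoolCPU and PoolGPU rows changed.
for old, new in replacements.items():
    text = text.replace(old, new)

CONFIG.write_text(text, encoding="utf-8")
print("Patched", CONFIG)
```

Run it once, launch the game, and if anything goes wrong just restore the .bak copy.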
 
Why is the 10900K's performance so much better in the memory quantity tests? Its minimums are practically the same as the 5950X's averages in this article.
Were different areas tested, or does memory overclocking really do that much for Intel?
 
In the article you wrote that the Ryzen 7 1800X saw little gain from the "fix", but all of your charts show the opposite. It seems to me that one of the two got it backwards. Please fix this so it doesn't mislead more people.
 
As a 3900X owner, I find this bug concerning. It's yet another reason to wait 3-9 months to play Cyberpunk, once it's actually complete and performing well. Having employees crunch to finish a game results in all sorts of small bugs like this one. There are probably all sorts of things coded into the game but disabled at the moment because of the rush to release.

Besides, I'm not entirely sure a GTX 1070 is going to provide a good enough experience. Maybe an RX 6800 or RTX 3070 will, but who knows when they'll actually be in stock again.

For a campaign game, you don't really need 60+ fps. On my Vega 64 with FreeSync, the action looks fluid and smooth, especially if you're used to consoles, where most games run at 30 fps.

My Vega 64 paired with my 3900X gets about 40-45 fps at the 1440p Medium preset, which looks amazing. High and Ultra don't really look much better to me; honestly, it feels like diminishing returns when bumping up the settings.

I also didn't need any of these fixes. My GPU sits at 99% the entire time and uses about 7.5 GB of VRAM, and the game uses about 10 GB of system RAM in total.
 
I think you need enough resolution to make the higher presets worth it. Take, for instance, a face texture that is typically 512 pixels or more across. If the face only takes up 256 pixels on screen, what would be the point of doubling that?
I think there is a disconnect between developers and players when it comes to matching texture size to screen resolution. For something like a PS4 or Xbox One outputting at 1080p, your pixel density is going to be limited. I also don't think game education institutions and developers understand this concept well. When determining texture size, start with a metric like 256 pixels per meter, estimate the area a texture will cover, and pick the size accordingly. If you know you will be targeting 4K, go up to 512 pixels per meter. With a little math, you can also work out the LoD distances.
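To put rough numbers on that, here's a quick Python sketch of the pixels-per-meter math; the 2 m wall panel, 60-degree vertical FOV, and 1080-pixel-tall screen are made-up example values, not anything taken from the game:

```python
import math

def texture_size(object_m: float, px_per_m: int) -> int:
    """Smallest power-of-two texture meeting the target texel density."""
    return 2 ** math.ceil(math.log2(object_m * px_per_m))

def lod_distance(object_m: float, tex_px: int,
                 screen_h_px: int = 1080, vfov_deg: float = 60.0) -> float:
    """Distance (m) at which the object projects to about tex_px pixels;
    beyond it, the next mip level down should look identical."""
    half_fov = math.radians(vfov_deg) / 2
    # projected pixels ~= object_m * screen_h_px / (2 * d * tan(half_fov))
    return object_m * screen_h_px / (2 * tex_px * math.tan(half_fov))

wall_m = 2.0                                 # example: a 2 m wall panel
print(texture_size(wall_m, 256))             # 512  -> 1080p target density
print(texture_size(wall_m, 512))             # 1024 -> 4K target density
print(round(lod_distance(wall_m, 512), 1))   # ~3.7 m: past this, a 256-texel mip is enough
print(round(lod_distance(wall_m, 256), 1))   # ~7.3 m: past this, a 128-texel mip is enough
```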
 
Using higher-resolution textures usually doesn't impact performance much, though, provided the graphics card has enough VRAM that it doesn't need to keep swapping textures out to system memory. On PC, users get the option of turning texture quality down on cards with limited VRAM, and on the consoles (which are actually rendering this game at around 720p or below on the base models), texture quality is similarly reduced as needed to stay within their limited memory capacity. So texture resolution is not likely to affect performance much, though I suppose it could cause stutters when streaming data off the slow laptop-class hard drives in the consoles.
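For a rough sense of the memory side of that, here's a back-of-the-envelope sketch; the BC7-style compression rate (about 1 byte per texel) and the sizes are just example assumptions:

```python
def texture_vram_mb(side_px: int, bytes_per_texel: float = 1.0,
                    with_mips: bool = True) -> float:
    """Approximate VRAM cost of one square texture in MB."""
    size = side_px * side_px * bytes_per_texel
    if with_mips:
        size *= 4 / 3  # a full mip chain adds roughly a third
    return size / (1024 * 1024)

print(round(texture_vram_mb(2048), 1))  # ~5.3 MB
print(round(texture_vram_mb(4096), 1))  # ~21.3 MB: 4x the memory for one quality step up
```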
 
Renderers don't care what the screen size is until the rasterization step. While I haven't done much 3D programming, I do know that textures are bound to a mesh before rasterization; everything after rasterization is about working out the final color of each pixel from the applied textures. On top of that, the on-screen size of a mesh changes constantly as the camera moves. Unless you're willing to work out the 99% "view cases" for every mesh in your game, there's no point trying to hand-tune texture sizes at that level; LoD hints and mipmapping work just fine.

And as mentioned, texture size only causes a performance hit if you're running out of VRAM. Otherwise the impact is minimal, since a texture is essentially just a lookup table for the renderer.
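As a toy illustration of the mipmapping point (the numbers here are made up, and real hardware uses screen-space derivatives rather than this simplification):

```python
import math

def mip_level(texture_px: int, pixels_on_screen: float) -> float:
    """Rough mip level sampled when a texture texture_px texels across
    covers pixels_on_screen pixels on screen."""
    return max(0.0, math.log2(texture_px / pixels_on_screen))

# A 2048-texel face texture drawn across only 256 screen pixels:
print(mip_level(2048, 256))   # 3.0 -> the sampler reads the 256-texel mip
# The same texture drawn across 1024 pixels (closer camera or higher res):
print(mip_level(2048, 1024))  # 1.0 -> it reads the 1024-texel mip
```

So an oversized texture mostly just sits in VRAM; the shader ends up reading roughly the same amount of data either way.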
 
Not sure about keeping textures on High, but other settings that impact performance can be turned down, such as the cloud quality and one of the shadow settings. I tried the FidelityFX settings for the first time and the game crashed after maybe 10 minutes. There's supposedly a patch coming, so hopefully it's good.