Question: Old games look different running on Windows XP and Windows 10 at the same resolution

Root602

Reputable
May 20, 2021
When I compared Hitman: Codename 47 running on my Windows 10 gaming PC and on a Windows XP machine, I noticed that they look different in subtle ways. At the same resolution on both machines, the game on Windows 10 looks much more artificial: sharper, less realistic, with less contrast, and the colours just look wrong and dull. On Windows XP there are in-game effects that you simply can't see on Windows 10. The game looks more impressive graphically, as though it's running the way it was meant to be seen back in 2000: much more contrast, more accurate colours, softer edges, the game just looks like it's supposed to.

I compared the mirror effect in the bathroom on both machines. On Windows 10 the mirror reflects Agent 47 incorrectly and his model looks a bit transparent compared to Windows XP. The tiles on the bathroom floor on Windows XP reflect light in many different directions, while on Windows 10 they reflect in only one. And it's not just this game: every game from the 2000s I've tried looks better running on the system it was designed for, and looks wrong and artificial on Windows 10.

I've been trying to work out the reason behind these differences, and at first I thought it had something to do with the version of DirectX. Windows XP runs DirectX 9, while Windows 10 runs DirectX 12. Windows 2000, which the game was originally released for, could also run the game on DirectX 9, so DirectX 9 on Windows XP matches the runtime the game originally used. I read on a forum somewhere that features used in older versions of DirectX are not always supported by newer versions, and that some features are dropped with every new release. Maybe DirectX 12 is missing some feature the game relies on? Microsoft's support for older DirectX versions in 3D applications like games has also never been good. Is this difference in DirectX versions between the two OSes the reason for the difference in graphics?
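
One quick sanity check here is to confirm which DirectX runtime each machine actually reports, using the stock dxdiag tool. A minimal sketch, assuming a Windows host (and note this shows the installed runtime, not which API version the game itself uses):

```python
import os
import subprocess
import tempfile

# Ask dxdiag to write its report to a temp file (/t is the standard
# dxdiag flag for a text report), then pull out the version line.
report = os.path.join(tempfile.gettempdir(), "dxdiag_report.txt")
subprocess.run(["dxdiag", "/t", report], check=True)

with open(report, encoding="utf-8", errors="ignore") as f:
    for line in f:
        if "DirectX Version" in line:
            print(line.strip())
            break
```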
 

Root602

Reputable
May 20, 2021
Is your hardware identical on XP and 10?
Try compatibility mode XP on 10.
No.

Windows XP runs on a Pentium 4 1.6GHz Northwood, 64MB RAM, and an Nvidia GeForce2 MX 400. Windows 10 runs on an i7-9750H, 16GB DDR4 RAM, and a GTX 1650 (laptop).

I tried compatibility mode on Windows 10 for Oblivion and, surprisingly, the graphics changed. Here are some screenshots comparing the game with compatibility mode off and on:



 
No.

Windows XP runs on a Pentium 4 1.6GHz Northwood, 64MB RAM, and an Nvidia GeForce2 MX 400. Windows 10 runs on an i7-9750H, 16GB DDR4 RAM, and a GTX 1650 (laptop).
Does the old PC have an old monitor? A CRT, the kind with a picture tube in it, displays images very differently, and old games generally look much better on one.
 

Root602

Reputable
May 20, 2021
Does the old PC have an old monitor? A CRT, the kind with a picture tube in it, displays images very differently, and old games generally look much better on one.
It has an old ViewSonic LCD monitor from 2006 that's 1280x1024. Comparing Windows 10 and XP again shows the same differences even when using this monitor on both computers. It seems like compatibility mode fixes these graphical differences when running games on Windows 10, but I'll have to check in detail. Interestingly, old games do actually look better on the old monitor because it's analog, but that's not related to the graphics themselves.
 
You're seeing what I see as well. Honestly, I think it's the hardware the game was played on at the time.
I still have my 2004-05 parts, and I get bored now and then and fire them up. I first noticed it with my 6800GT: I played Crysis, and going back I swore I had put on an HD texture pack, it looked that awesome. It was lush with deep colours, the grass looked like I remember, and the water had that wow factor we saw back then.

Crysis still looks great on my modern systems, but yes, it had a different, crisp look back then.

Tomb Raider Anniversary
Tomb Raider Legend
Far Cry all have a hint of a different look. You know that feeling when you get a new pair of glasses and for the first few days the world looks different until your eyes adjust, then one day you switch back to your old glasses and it feels familiar?

I look at it like this: at least our older games still play and look great on our modern GPUs, versus going back to a game released for a 3dfx Voodoo GPU, like the original Tomb Raider, where you're really scratching your head because it looks like a pixel salad with textures.

But if you ever see the original Tomb Raider played on a period-correct 3dfx Voodoo, the game looks awesome for where we were in 2000.
 
Interestingly, old games do actually look better on the old monitor because it's analog, but that's not related to the graphics themselves.
Being analog would have nothing to do with it. If anything it should be slightly worse, because LCD monitors are still digital technology, so using VGA means converting digital to analog and then back to digital again. You probably just prefer how the older LCD looks for these games.

You're seeing what I see as well. Honestly, I think it's the hardware the game was played on at the time.
I would argue a lot of it has to do with how older graphics cards basically had to "cheat" to attain acceptable performance while rendering to a 32-bit colour output, even after selecting between 16-bit and 32-bit colour was no longer an option (I think that was around 2002-2003?).

For example, anisotropic filtering pattern tests used to be a thing in the mid-2000s because video cards produced funky results: https://www.anandtech.com/show/1293/8 . Allegedly the algorithm that produced the pattern was chosen because it had less impact on performance.

And I found this: https://www.tomshardware.com/reviews/performance-leap,789-47.html . The linked section compares how the GeForce 6800 renders Far Cry against the GeForce FX 5950 and the Radeon 9800.

So now that modern GPUs have more than ample power to handle FP32 per colour channel per pixel, we may be seeing what we get without all those "optimizations", because they're no longer necessary, with unintended side effects on image quality.
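
To get a rough feel for what those old precision shortcuts do to an image, here's a minimal sketch, assuming Pillow is installed and using placeholder filenames, that re-quantizes a screenshot to R5G6B5, the classic 16-bit framebuffer format (real hardware also dithered, so this is only an approximation):

```python
from PIL import Image

def quantize_r5g6b5(img: Image.Image) -> Image.Image:
    """Truncate an RGB image to 5/6/5 bits per channel, mimicking a
    16-bit framebuffer; banding appears in smooth gradients."""
    r, g, b = img.convert("RGB").split()
    r = r.point(lambda v: (v >> 3) << 3)  # keep 5 bits of red
    g = g.point(lambda v: (v >> 2) << 2)  # keep 6 bits of green
    b = b.point(lambda v: (v >> 3) << 3)  # keep 5 bits of blue
    return Image.merge("RGB", (r, g, b))

original = Image.open("screenshot.png")    # placeholder filename
quantize_r5g6b5(original).save("screenshot_16bit.png")
```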
 
I would argue a lot of it has to do with how older graphics cards basically had to "cheat" to attain acceptable performance
I agree. Remember when Valve held the big Half-Life 2 demo and the ATI 9800 Pro was the card being used at the news-media dinner event Valve put on? It was a disaster; the game played like crap. Something about the four render pipelines being too long on the ATI card threw the timing off with the game engine.

Valve went back to the game and did a patch that smoothed out that issue. People who had shelled out for the ATI card were pissed, but in the end Valve fixed it.

People were pissed again a few months later with the same ATI cards, as it seemed all the next year's games needed Shader Model 3.0, and your less-than-a-year-old ATI card only had Shader Model 2.0.

But as an end user, with all the background faking and tricks, what you saw on screen is what you remember.
 

Root602

Reputable
May 20, 2021
I found a good comparison. Compare this video of Medal of Honor: Allied Assault:
https://www.youtube.com/watch?v=a1I78HWFfDI&pp=ygUPOTgwMCBhdGkgZ3RhIHNh


With this one:
https://www.youtube.com/watch?v=AAy41daco4U&pp=ygUobWVkYWwgb2YgaG9ub3IgYWxsaWVkIGFzc2F1bHQgd2luZG93cyAxMA%3D%3D


With the retro video card there is more detail on the rifle, better contrast, and the lighting is better. Look at the lamps: way brighter, with correct colour, on the retro card. On the RTX 2080 the lamp effect is wrong. Overall the retro video card renders the game far better and more correctly; with the modern card, everything looks artificially aged, drab, and dull.
 
Well, I do have a Windows 98 computer I could try, and Medal of Honor: Allied Assault is probably one of the few games that can run on Windows 11 without issue.

Now all I need to find is a DVI-to-HDMI adapter.

However, I will say something seems off about the first video in terms of colouring. It looks oversaturated, and the gamma correction looks off.
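
One way to test whether a gamma mismatch alone explains that look is to apply a gamma curve to a captured frame and see whether it starts to resemble the other video. A minimal sketch, assuming Pillow is installed and using placeholder filenames:

```python
from PIL import Image

def apply_gamma(img: Image.Image, gamma: float) -> Image.Image:
    """Apply a per-channel gamma curve; gamma > 1 brightens midtones,
    gamma < 1 darkens them."""
    lut = [round(255 * (i / 255) ** (1.0 / gamma)) for i in range(256)]
    return img.convert("RGB").point(lut * 3)

frame = Image.open("video_frame.png")      # placeholder filename
apply_gamma(frame, 1.25).save("frame_gamma.png")
```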
 
So I fired up Medal of Honor: Allied Assault on my current PC and my Windows 98 PC and played around in the first level. I can't see any practical difference that would make the game look better on one set of hardware or the other, though I did notice my current PC has issues with seams on certain 3D objects.

As a point of reference, my Windows 98 PC is running a Radeon 7500.
 
I realized that since OP posted screenshots of Oblivion showing differences, I should be able to do the same with Medal of Honor: Allied Assault.

On my current computer (the seam in the skybox is a glitch)
mohaa-4070ti.png


On my Windows 98 computer
mohaa-r7500.png


If there's any practical difference between the two, I don't see it.
 

Root602

Reputable
May 20, 2021
If there's any practical difference between the two, I don't see it.
If I look very closely and zoom into the bottom image and compare it to the top image, the bottom image looks softer overall and the rifle looks darker. What do you mean by "practical difference"? What modern Windows does is practically enough for most people, but not for me. I'm a perfectionist and can't stand a game looking different from what the developers intended. I think these small differences are why some people build retro gaming PCs.

Is compatibility mode for Windows 98 turned on for the first image? The images might look identical if it's turned on.
 

DSzymborski

Curmudgeon Pursuivant
Moderator
If we're going to post screenshots in this thread, it would be helpful if posters *don't* label them at the time they post them. Short of actual A/B testing, it's the best way to mitigate the placebo effect.
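
In that spirit, here's a minimal sketch of one way to blind your own screenshots before judging them, with hypothetical filenames: copy them under shuffled anonymous names and keep the answer key in a file you only open afterwards.

```python
import json
import random
import shutil
from pathlib import Path

# Hypothetical source screenshots, one per hardware setup.
sources = ["4070ti.png", "r7500.png"]
anon_names = [f"image_{i}.png" for i in range(len(sources))]

# Randomize which source ends up under which anonymous name.
random.shuffle(sources)
key = {}
for anon, src in zip(anon_names, sources):
    shutil.copyfile(src, anon)
    key[anon] = src

# Keep the answer key aside; open it only after judging the images.
Path("answer_key.json").write_text(json.dumps(key, indent=2))
```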
 
If I look very closely and zoom into the bottom image and compare it to the top image, the bottom image looks softer overall and the rifle looks darker.

Is compatibility mode for Windows 98 turned on for the first image?
There's the flaw that I wasn't in the exact same spot, so I can run this again and transfer my save files over. Plus that makes it easier to run a difference filter in an image editor to see where the differences actually are, if any.

And I just realized I used different screenshot methods. Windows 98 let me use Print Screen, but Windows 11 had issues, so I used the game's built-in screenshot feature, which saves in TGA format.
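
TGA is just an old, usually uncompressed image format, and common image libraries can read it directly. Here's a minimal sketch for batch-converting the game's TGA screenshots to PNG, assuming Pillow is installed and a hypothetical screenshots/ folder:

```python
from pathlib import Path
from PIL import Image

# Convert every TGA screenshot in the (hypothetical) folder to PNG.
for tga in Path("screenshots").glob("*.tga"):
    Image.open(tga).convert("RGB").save(tga.with_suffix(".png"))
```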

Also, Windows 98 doesn't have compatibility features; I believe those were introduced in Windows XP, if not Vista.

If we're going to post screenshots in this thread, it would be helpful if posters *don't* label them at the time they post them. Short of actual A/B testing, it's the best way to mitigate the placebo effect.
The point I'm trying to make isn't which one looks better; it's how much of a difference there is between the two. So when I say "practical difference", I mean that, for example, an RGB value of 128, 54, 92 has no practical difference from 128, 55, 92. Yes, there is a difference, but you're not really going to notice it.
 
So I went back to MOHAA and made saves so that I could take screenshots from a consistent position. I also noticed that in my previous attempt I had left image sharpening on in the driver settings, so I turned it off. And to make sure I used the same capture method throughout, I used the in-game one.

Here's a link to the album: https://imgur.com/a/ty0H0wl . The album includes:
  • 3 different scenes
  • Render from the RTX 4070 Ti with no compatibility mode
  • Render from the RTX 4070 Ti with Windows XP SP3 compatibility mode
  • Render from the Radeon 7500
  • Difference filter between the 4070 Ti without compatibility mode enabled and the 7500
  • Difference filter between the 4070 Ti with compatibility mode enabled and the 7500
  • Difference filter between the 4070 Ti with and without compatibility mode enabled
The difference filter pretty much makes it conclusive that there's very little difference between the rendering modes, especially between compatibility mode enabled and disabled. Note that the skybox in the scenes is still moving, and in the third image the NPC moves as well, so those areas shouldn't really be considered in the comparison.

The only reason I didn't run a difference filter on one of the scenes is that I couldn't get a consistent image, but I wanted to include it anyway just to have another scene.
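
For anyone who wants to reproduce the difference-filter step outside an image editor, here's a minimal sketch, assuming Pillow is installed and using placeholder filenames, that subtracts two same-size screenshots and auto-stretches the result so faint deltas become visible:

```python
from PIL import Image, ImageChops, ImageOps

# Placeholder filenames; the two screenshots must be the same size.
a = Image.open("4070ti_scene1.png").convert("RGB")
b = Image.open("r7500_scene1.png").convert("RGB")

# Per-pixel absolute difference; identical pixels come out black.
diff = ImageChops.difference(a, b)

# Stretch the histogram so faint differences are easy to see, similar
# to cranking the brightness on the diff in an image editor.
ImageOps.autocontrast(diff).save("diff_scene1.png")

# A quick numeric summary: the largest per-channel delta (0-255).
print("max per-channel delta:", max(hi for lo, hi in diff.getextrema()))
```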
 

Root602

Reputable
May 20, 2021
The difference filter pretty much makes it conclusive that there's very little difference between the rendering modes, especially between compatibility mode enabled and disabled.
For the image of the difference filter between the 4070 Ti with compatibility mode enabled and the 7500, with the brightness increased you can see the differences I noticed before, in the same areas. I remembered the top-left window looking softer than in the top image, and sure enough, both that and the differences in the rifle are highlighted. This means there is a difference between retro video cards and new video cards. It might be small, but it's definitely there, and if I could notice it without knowing it existed before you posted the difference filter, then other people can too, and they do. Compatibility mode, meanwhile, makes no difference at all to the rendering, which suggests GPU manufacturers don't care whether old games look the same on their GPUs and only care about new games.
 
This means there is a difference between retro video cards and new video cards. It might be small, but it's definitely there.
All I really see between the images is a technical difference, not one most people would notice in practice. While I could take more screen captures and do a blind test, the point of this exercise was mostly to find evidence that there isn't a major difference in the render output, as you suggested in https://forums.tomshardware.com/thr...at-the-same-resolution.3820192/#post-23103923. So I think this exercise has run its course.
 

Root602

Reputable
May 20, 2021
I will attach my own screenshots to this post later today to prove that there is a big difference, and that it's not just "pixels" and a "technical" difference.
 

Root602

Reputable
May 20, 2021
It appears that my monitor makes games look better, and my laptop screen makes them look worse. That accounts for the differences in graphics I saw, so there is no difference between new and old GPUs, or between Windows XP and Windows 10, for old games. I do have a particular problem with the way my laptop displays games, though. Note that the aspect ratio is stretched in the named photos; screenshots 1 and 2 are from the old computer. There is a problem with the colour of the lights: in Sam's (the character's) night vision the lights are glowing when they're not supposed to be, and in the second photo the ceiling lights are supposed to emit yellow light, not blue light like on my laptop. Does anyone know what the problem could be?


Update: looks like there is a difference in graphics cards after all.
 