[SOLVED] HTPC: Gaming Upgrade?


Doogals

Distinguished
Dec 5, 2014
72
2
18,645
I have a low budget HTPC that I originally built in 2015 for streaming TV shows and movies online and also to use as a Blu-ray player (see specs below).

It was never intended to be a gaming machine, other than Minecraft or some of the old Valve titles like Portal and Team Fortress 2 (which are playable but the frame rate and resolution are not great).

My question is, would upgrading the GPU in this system be enough to improve its gaming performance?

If so, would anyone have a budget friendly recommendation for a GPU that would make a difference?

CPU (Integrated): Intel Celeron J1900 Bay Trail-D @ 1.99GHz
Motherboard: ASRock Q1900M
RAM: 8GB (2x4GB) G.SKILL DDR3 @ 666MHz
GPU: 1024MB ATI AMD Radeon HD 5450 (XFX)
SSD (OS): Samsung 850 EVO 250GB 2.5”
HDD (Storage): WD Blue 1TB 3.5” (WD10EZEX)
SATA PCIe Controller: Syba SD-SA2PEX-2IR
Power Supply: Antec VP 450
Optical Drive: LG Blu-ray Disc Rewriter (WH14NS40)
OS: Windows 10 Home 64-Bit
Audio: AMD High Definition Audio
Case: Silverstone Grandia GD05B

(Note: Optical drive is connected via the SATA PCIe controller.)
 
Solution
I do not think the fps improvement you might get (to maybe 10-12) is worth doing anything.
The power of your CPU is simply too anemic.

To upgrade, you will need a cpu, motherboard and ddr4 ram.
That can be as little as an Intel G5600 for $110, a 300-series motherboard for $60, and a 2x4GB DDR4 RAM kit for $40.
The G5600 is a 4-thread processor with a Passmark rating of 5660 and a much stronger single-thread rating of 2257.
The G5600 also includes HD 630 integrated graphics with a rating of 1121, considerably faster than your current HD 5450 at 232.
The HD 630 just gets you going.
About the cheapest GPU that can be considered for gaming would be the GT 1030.

A similarly priced alternative would be a Ryzen 3 2200G.
It includes a...

DSzymborski

Curmudgeon Pursuivant
Moderator
Wanted to post a quick follow-up with my findings after playing two older game titles.

I installed The Lego Movie Videogame (2014) and Lego Batman: The Videogame (2008), and both seem to be running great without any issues.

So far I’ve only used their default video settings without making any adjustments, and Lego Movie averages around 30fps and Lego Batman closer to 60fps.

Could my issue with The LEGO Ninjago Movie Video Game (2017) be that it’s simply too current of a title for my underpowered system?

Well, it wouldn't be surprising. Both your CPU and GPU are well below the minimum requirements. A Celeron J1900 is much slower than a Core 2 Quad Q9300 or an A8-3850, and an HD 5450 is much less powerful than a GT 430.
 

Karadjgne

Titan
Ambassador
It's an issue with game code complexity, instruction sets, thread usage, etc. You have a 4c/4t 10W CPU running at 2GHz. More modern titles are going to want upwards of 3GHz on processors using 50W+, and the cooling to match.

Your CPU simply doesn't have the horsepower. It was never intended for the rigors of modern gaming. It is an HTPC after all, which will handle 4K playback of movies, videos, music, web surfing, and older, simpler, less demanding games. Just not the new stuff.
 

Doogals

Distinguished
Update (two years and nine months later).

I purchased a GT 1030, and my expectations were definitely low, but I was shocked by the difference.

Here are Fraps benchmarks using the Lego Ninjago Movie Video Game with the same settings and same level for both runs:

HD 5450
LEGONINJAGO_DX11
Frames: 1185 - Time: 180000ms - Avg: 6.583 - Min: 5 - Max: 8

GT 1030
LEGONINJAGO_DX11
Frames: 13520 - Time: 180000ms - Avg: 75.111 - Min: 62 - Max: 85
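(For anyone curious, Fraps' "Avg" figure is just total frames divided by elapsed seconds, which is easy to sanity-check against the runs above:)

```python
# Quick check: Fraps' "Avg" is Frames / (Time in seconds).
def avg_fps(frames, time_ms):
    return frames / (time_ms / 1000.0)

print(round(avg_fps(1185, 180000), 3))   # HD 5450 run  -> 6.583
print(round(avg_fps(13520, 180000), 3))  # GT 1030 run  -> 75.111
```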

Previously this game was unplayable, and while it’s far from the latest or greatest titles, I’m really excited to breathe a little more life into this system and keep it going.

It also has me wondering if something was wrong with the HD 5450, but I'm not sure how I would check that.

Another note: due to low clearance under the GT 1030, I had to connect my secondary HDD using the PCIe SATA card I have for my optical drive. Is it possible this helped performance by freeing up PCIe bandwidth?

Very much looking forward to testing my other games to see the difference now.
 

Karadjgne

Titan
Ambassador
It's a simple matter of size vs power. The HD 5450 is an iGPU. It's a small portion of a very small chip under the IHS; your pinkie nail is 3-5x larger than an iGPU. And all the power and RAM are supplied by the motherboard, through the CPU, and back through the motherboard to the outputs. That's a two-lane highway in Los Angeles at rush hour.

The GT 1030 is simply massive in comparison: the GPU die is bigger than your thumbnail, it gets a full 75W from the PSU, its own VRAM that's not relegated to scheduled sharing through the CPU, dedicated power stages, and a heatsink it doesn't have to share with whatever the CPU is loaded with. But most important are the Nvidia drivers.

Your CPU does very well with 2D applications like video. For that it's fine; 2D doesn't use up all that many resources. 3D is an entirely different ballgame, and an iGPU is not set up to realistically deal with that. A discrete GPU is.

It's not that there was anything wrong with the HD5450, it's just woefully inadequate for 3d tasking.
 

DSzymborski

Curmudgeon Pursuivant
Moderator
Update (two years and nine months later).

I purchased a GT 1030, and my expectations were definitely low, but I was shocked by the difference.

Here are Fraps benchmarks using the Lego Ninjago Movie Video Game with the same settings and same level for both runs:

HD 5450
LEGONINJAGO_DX11
Frames: 1185 - Time: 180000ms - Avg: 6.583 - Min: 5 - Max: 8

GT 1030
LEGONINJAGO_DX11
Frames: 13520 - Time: 180000ms - Avg: 75.111 - Min: 62 - Max: 85

Previously this game was unplayable, and while it’s far from the latest or greatest titles, I’m really excited to breathe a little more life into this system and keep it going.

It also has me wondering if something was wrong with the HD 5450, but I'm not sure how I would check that.

Another note: due to low clearance under the GT 1030, I had to connect my secondary HDD using the PCIe SATA card I have for my optical drive. Is it possible this helped performance by freeing up PCIe bandwidth?

Very much looking forward to testing my other games to see the difference now.

It's not surprising at all. The GT 1030 was an entry-level gaming GPU, but it was an actual gaming GPU that you could play games on. Even AAA games if you weren't too aggressive at pushing the quality settings.

By contrast, the HD 5450 was a non-gaming GPU released seven years before the GT 1030. It was never designed as something to game on. It was basically a "sometimes you need a GPU because the CPU doesn't have one and you need some kind of graphics to be able to physically see Windows" type of GPU.

The difference ought to have been night and day (and it appears to be for you). A GT 1030 isn't powerful, but hey, it's an actual car that you can drive to work. By comparison, the HD 5450 was a cheap plastic lawn chair, which will definitely not drive you anywhere.
 

Doogals

Distinguished
Is the HD 5450 still considered an integrated GPU if it’s a dedicated PCIe card with 1GB of memory?

It's a simple matter of size vs power. The GT1030 is simply massive in comparison.

Looks like the HD 5450 die is 59 mm² and the GT 1030 die is 75 mm², so it's about 27% larger.

Big difference seems to be transistors, 292 million for the HD 5450 vs 1.8 billion for the GT 1030.
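(Working out the ratios from the die-size and transistor figures quoted above:)

```python
# Ratios from the die-size and transistor figures above.
hd5450_area, gt1030_area = 59.0, 75.0   # die area, mm^2
hd5450_tr, gt1030_tr = 292e6, 1.8e9     # transistor counts

print(f"die area: {gt1030_area / hd5450_area:.2f}x")  # 1.27x
print(f"transistors: {gt1030_tr / hd5450_tr:.1f}x")   # 6.2x
```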

Interesting point regarding the drivers; I didn't realize the differences went beyond making sure the correct version was installed for your system.
 

DSzymborski

Curmudgeon Pursuivant
Moderator
Is the HD 5450 still considered an integrated GPU if it’s a dedicated PCIe card with 1GB of memory?



Looks like the HD 5450 die is 59 mm² and the GT 1030 die is 75 mm², so it's about 27% larger.

Big difference seems to be transistors, 292 million for the HD 5450 vs 1.8 billion for the GT 1030.

Interesting point regarding the drivers; I didn't realize the differences went beyond making sure the correct version was installed for your system.

Integrated GPU refers to a GPU that's actually on the CPU (or in older PCs, the motherboard chipset). A 5450 isn't an integrated GPU; it's a discrete office GPU.
 

Karadjgne

Titan
Ambassador
Is the HD 5450 still considered an integrated GPU if it’s a dedicated PCIe card with 1GB of memory?
That's my bad. Dunno what I was thinking; I see HD graphics so often associated with newer(ish) Intel iGPUs that I'd forgotten old AMD used HD as well. The HD 5450 is somewhat ancient. That said, 292 million vs 1,800 million transistors isn't much different from the 1/3rd pinkie-vs-thumbnail comparison. And it's more than just the transistor count: there's also the VRAM (DDR3 vs GDDR5, a massive difference), ROP count, chip architecture (Maxwell changed the game there), raster and pixel shaders, etc.

Back in the '70s, a good 5.7L V8 put out just over 200HP stock. Today, the same 5.7L can put out over 600HP, stock, and get better gas mileage on corn-syrup gas. In some respects, the GT 1030 is anywhere from 300% to 500% stronger just based on drivers and architecture. Technological advances.
 

Doogals

Distinguished
The GT 1030 was an entry-level gaming GPU, but it was an actual gaming GPU that you could play games on.

This does have me wondering what kind of additional performance I would have seen from a GTX 1050, GTX 1050 Ti, or GTX 1650.

Since my CPU and RAM limit what the GT 1030 can do, would a more powerful GPU, even while limited, have provided better results?
 
Depends on the game, the resolution, and in-game quality settings.

Simplified way a PC plays a game:
  1. CPU figures out what needs to be in a given frame (imagine a rough sketch) based on user and game world input. Issues draw call to GPU to tell it what to render.
  2. GPU receives draw call and makes a pretty picture. Sends to monitor when complete.
  3. The GPU can't do any work until the CPU tells it what to draw. Raising graphics settings and/or resolution increases the complexity of the GPU's job, making it take longer to render each frame. Lowering settings decreases the complexity of the GPU's job, making it take less time to render each frame.
  4. If the GPU finishes rendering a frame before the CPU has finished figuring out what the next frame should contain, the GPU has to wait (<100% GPU usage).
  5. Based on #3 and #4, you should be able to optimize for 90% or greater GPU usage (depending on a game's CPU stress and the CPU/GPU balance of a system).
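The CPU-bound vs GPU-bound behavior in the steps above can be sketched as a toy model; the per-frame times here are invented purely for illustration:

```python
# Toy model of steps 1-4: each frame needs CPU prep time and GPU render time,
# and the slower stage paces the whole pipeline.

def frame_stats(cpu_ms, gpu_ms):
    """Return (fps, gpu_usage) for hypothetical per-frame stage times."""
    frame_ms = max(cpu_ms, gpu_ms)   # slower stage sets the frame time
    fps = 1000.0 / frame_ms
    gpu_usage = gpu_ms / frame_ms    # fraction of each frame the GPU is busy
    return fps, gpu_usage

# A slow CPU (25 ms of prep per frame) caps fps at 40 regardless of the GPU:
print(frame_stats(cpu_ms=25.0, gpu_ms=10.0))  # (40.0, 0.4) -> CPU-bound
# Raising settings slows only the GPU stage; fps drops once it passes the CPU:
print(frame_stats(cpu_ms=25.0, gpu_ms=30.0))  # (~33.3, 1.0) -> GPU-bound
```

In this sketch, a faster card only helps until the GPU's per-frame time drops below the CPU's, which is why the gains from a stronger GPU shrink on a slow CPU.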
 

DSzymborski

Curmudgeon Pursuivant
Moderator
This does have me wondering what kind of additional performance I would have seen from a GTX 1050, GTX 1050 Ti, or GTX 1650.

Since my CPU and RAM limit what the GT 1030 can do, would a more powerful GPU, even while limited, have provided better results?

I would expect, given the slowness of the CPU, that additional marginal gains would generally be small.
 