News AMD vs Intel Integrated Graphics: Can't We Go Any Faster?

A fairer comparison would've been to use the Intel Iris GPUs, since the UHD 630 is a mid-range Intel iGPU. The Iris Pro 580 from 6th-gen Skylake is still Intel's most powerful iGPU to date, still a bit more powerful than even the latest Iris Plus iGPU in Coffee Lake. But either way, even the Iris Pro 580 is still way behind the GTX 1050 in performance by all measures.
I believe the Iris Pro 580 didn't actually see widespread use though. It was used in some mini-PC NUCs that cost around $1,000, while still not quite offering the gaming performance of an entry-level PC with a dedicated card that cost significantly less, making it kind of pointless. So it probably makes sense to stick with testing the common integrated solutions that people are far more likely to encounter.
 
I believe the Iris Pro 580 didn't actually see widespread use though. It was used in some mini-PC NUCs that cost around $1,000, while still not quite offering the gaming performance of an entry-level PC with a dedicated card that cost significantly less, making it kind of pointless. So it probably makes sense to stick with testing the common integrated solutions that people are far more likely to encounter.
Yeah, Iris Pro Graphics 580 was not widely used. Incidentally, I did have a Core i7-5775C processor, which I also tried in the same Z97 motherboard as the i7-4770K. That has Iris Pro Graphics 6200, which is basically the same as HD 4600 except with eDRAM and twice the EU/shader count. Unfortunately, the drivers for whatever reason were even worse than on the 4770K!

3DMark ran fine, but only two games out of nine would actually run -- The Division 2 (DX11) and Final Fantasy XIV. Metro Exodus would crash/hang before getting to the benchmark sequence. Borderlands 3 would crash to desktop before getting to the main menu. Far Cry 5 would crash/hang before getting to the menu. Forza Horizon 4 could get to the main menu screens, but would crash to desktop whenever I tried to continue the game or start the benchmark. Red Dead Redemption 2 would crash/hang before getting to the loading screens (before the menu). Same for Shadow of the Tomb Raider and Strange Brigade.

Performance in the games / tests that did work was about 25% higher than UHD Graphics 630, so better performance but horrible compatibility. This was with the most recent Intel 15.40.45.5126 drivers from March 25, 2020 -- the newer drivers have not included support for 5th Gen, and I'm curious to see if support has officially been ended, or if it will just get less frequent updates. Probably ended is my guess.
 
  • Like
Reactions: Chung Leong
It's all about money for sure -- but not the way you're suggesting. No one has successfully made a fast iGPU solution for PCs. You're saying Microsoft paid off not just AMD, but every other company that might think about doing such a thing. Not a chance. And equally unlikely MS was able to pay AMD off.

Easiest way to disprove that assertion: If MS could pay AMD to not make a faster iGPU for PC ... wouldn't it make far more sense to pay AMD to not make such a thing for Sony? Microsoft isn't worried about competition from PCs killing the market for Xbox at all. It hasn't been a problem since the original Xbox; why would things change now?

And $50 for 4GB of HBM2 is extremely expensive on a component level. I gave math earlier, but basically a chip like Picasso costs AMD around $35 to make, and Renoir is maybe $50-$60. So if AMD put a $60 chip with $100 of HBM2, it would need to either sell tens of millions of the chips at around $200 each, or else price would have to be much, much higher -- like $500+.

There simply isn't enough demand or profit potential in making an extreme-performance integrated graphics solution right now. Only custom designs that are basically guaranteed to sell tens of millions of units over time (Xbox, PlayStation, or Apple) can justify the cost.
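To put hypothetical numbers on that, here's a rough sketch in Python. Only the ~$60 die and ~$100 HBM2 estimates come from the post above; the $300M R&D figure and 30% combined margin are made-up round numbers for illustration, not actual AMD figures.

def needed_price(component_cost, rd_budget, units, margin=1.3):
    # Amortize R&D over expected unit sales, then add a combined margin
    # for AMD and its partners. All figures are illustrative assumptions.
    return (component_cost + rd_budget / units) * margin

print(needed_price(60 + 100, 300e6, 20e6))  # ~$227 if it sells 20 million units
print(needed_price(60 + 100, 300e6, 1e6))   # ~$598 if it only sells 1 million units

Same part, wildly different price, purely because of how many units the R&D gets spread across.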
This raises the question: why doesn't AMD just take one of their mobile APUs, put it in a desktop package, and turn the clocks/TDP up on both the iGPU and CPU?

The 7nm Vega and 7nm CPU cores would provide a good improvement over the current 3200G/3400G, and most of the development costs have already been paid.
 
This raises the question: why doesn't AMD just take one of their mobile APUs, put it in a desktop package, and turn the clocks/TDP up on both the iGPU and CPU?

The 7nm Vega and 7nm CPU cores would provide a good improvement over the current 3200G/3400G, and most of the development costs have already been paid.
Are you talking about Renoir? Because that is coming to desktops, but as with past APUs, it's a mobile-first chip. And there's a limit to what you can do with higher TDPs.

Basically, the chip and architecture are designed for a certain target, and even if you double the TDP you're not going to come anywhere close to doubling the performance. The ultra-mobile Zen+ APUs (e.g., Ryzen 7 3700U) have GPU clocks of 1400 MHz with a 10 CU chip. The Ryzen 7 3750H has the same 10 CUs and 1400 MHz, even with a 35W TDP vs. 15W. The desktop part meanwhile has a 65W TDP but keeps the max GPU clock at 1400 MHz -- it just has 11 CUs instead of 10.

Based on that and what we see with something like the Ryzen 9 4900H, the desktop models will probably have a maximum of 1750 MHz with 8 CUs still. They'll just be able to maintain full clocks on the CPU and GPU instead of throttling one or the other (or both) to stay within the TDP.
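For a rough sense of scale, here's the peak FP32 math. This is a sketch only -- the 8 CU / 1750 MHz desktop configuration is just the guess above, and real games are usually bandwidth-bound, so the gaps are even smaller in practice.

def vega_tflops(cus, mhz):
    # 64 shaders per Vega CU, 2 FLOPs per shader per clock (FMA)
    return cus * 64 * 2 * mhz / 1_000_000

print(vega_tflops(10, 1400))  # Ryzen 7 3700U / 3750H: ~1.79 Tflops
print(vega_tflops(11, 1400))  # Ryzen 5 3400G: ~1.97 Tflops
print(vega_tflops(8, 1750))   # guessed desktop Renoir config: ~1.79 Tflops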
 

DavidC1

Distinguished
May 18, 2006
If Intel uses HBM for their future integrated graphics solution, could the high-speed VRAM also function as an L4 cache?

It can work like that in applications that need higher bandwidth, but unlike the eDRAM, it doesn't have latency advantages, so it won't be much faster for the CPU.

So the HBM would basically be for the iGPU, and the high cost would have it competing with higher-end dGPU solutions.

The Iris Pro 580 from 6th-gen Skylake is still Intel's most powerful iGPU to date, still a bit more powerful than even the latest Iris Plus iGPU in Coffee Lake. But either way, even the Iris Pro 580 is still way behind the GTX 1050 in performance by all measures.

That's not true. The Iris Pro 580, despite needing 45W and having eDRAM, is about equal to the Iris Plus G7.
https://www.notebookcheck.net/Iris-...ics-G7-Ice-Lake-64-EU_7236_9866.247598.0.html

In the seven games shown, the Iris Pro 580 is faster in three and slower in four.

The 580 wins in Overwatch, SC2, and Guild Wars 2. The G7 wins in Rise of the Tomb Raider, MGS V, Dota 2 Reborn, and BioShock Infinite. You'll see that in some games, like The Witcher 3, the G7 will totally clobber the 580, because it's a new uarch addressing the weaknesses of the older one.

Of course, Jarred hasn't tested the Iris Plus G7.

Erm... maybe if they stopped making laptops 9mm thick?

This race to make everything wafer-thin and wafer-light has consequences, you know.

Not only that. The tablet/smartphone world has emphasized portability. Even the laptop world is starting to move to fanless devices. Even thinner and lighter devices are coming.
 
  • Like
Reactions: Chung Leong

Chung Leong

Reputable
Dec 6, 2019
Even if they'd theoretically make a 200 W integrated GPU, it would run into even more cooling issues due to the heat from the CPU side.

Cooling issues could be more easily dealt with, I suspect, when the GPU sits inline with the CPU instead of being perpendicular to it. The PCI-E form factor is pretty limiting if you think about it.
 

jgraham11

Distinguished
Jan 15, 2010
Ryzen 5 3400G has other factors, namely it runs the PCIe with an x8 link width, but I still don't think it will make much if any difference. I could test with Pentium Gold G5400, or even a Ryzen 3 3200G, and I'm sure the GTX 1050 will be the bottleneck in most games. Which do you want, then: G5400, 3200G, or just stick with the 3400G? (I've swapped the CPU out already, so regardless of which I test, I'll need to change the CPU.)


Yes, generally speaking -- and the GTX 1050 is definitely hindered any time I run it in DX12 mode if DX11 is an option. Metro Exodus, Shadow of the Tomb Raider, Borderlands 3, The Division 2 all run better in DX11. So if you really want to see GTX 1050 performance in the best light, I'll test at 720p using the 'best' API as a comparison point. I haven't done that lately because often it doubles the amount of testing -- I need to check both APIs to see which is faster, and sometimes it's resolution/setting dependent as well. (That mostly only applied to ultra-fast GPUs, however, like RTX 2070 Super and above -- the GTX 1050 likely never benefits from DX12 vs. DX11.)


I didn't include 1080p testing because I only ran it on the Vega 11 and GTX 1050. Since this article was about integrated graphics, and since Intel's UHD 630 clearly can't handle 720p minimum settings in most games, trying to run it at 1080p medium settings either results in crashes, failure to run due to framerate requirements, or performance that's so slow as to be meaningless. Many games also choke if they fall below 10 fps -- everything slows down, physics gets wonky, etc. So a benchmark that takes 60-90 seconds normally might take five minutes or more at 5 fps. That's a lot of time wasted just to prove what is already known. So, including 1080p medium, I'd only have two results in the charts: GTX 1050 and Vega 11 3400G, which isn't very useful.

Jarred, I would have preferred the testing originally be done with something more comparable in price; however, since the results are already posted, the 3400G would make the most sense to add to the article.

I don't think the x8 link would matter either, considering the 5500 XT (a much more powerful GPU) comes standard with only an x8 link.

Thank you for your other responses,

Jesse
 

watzupken

Reputable
Mar 16, 2020
I don't think Renoir will bring a tangible difference when it comes to integrated graphics compared to the current Vega 11. Renoir's main strength is the use of Zen 2 CPU cores, which, core for core, bring a significant improvement on the CPU side of things. On the GPU side, it actually regressed from Vega 11 to Vega 8, though the higher clock speed can somewhat compensate for the loss of the three CUs. Ultimately, the bandwidth limitations on APUs (shared resources between CPU and GPU) will cap the performance gain. With Zen 2 there is the possibility of using faster RAM, i.e. faster than 3600 MHz, but that RAM is pricey and requires fiddling with the Infinity Fabric clock so that you don't lose performance.
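To put numbers on that bandwidth ceiling, here's a quick sketch; DDR4-4266 is just an example of the pricier memory mentioned above, not a specific recommendation.

def dram_bandwidth_gbs(channels, mts):
    # Each DDR4 channel is 64 bits (8 bytes) wide, so GB/s = channels x 8 x MT/s / 1000
    return channels * 8 * mts / 1000

print(dram_bandwidth_gbs(2, 3200))  # 51.2 GB/s, shared between the CPU and the iGPU
print(dram_bandwidth_gbs(2, 4266))  # ~68.3 GB/s with faster (and pricier) memory

For comparison, even a lowly GTX 1050 gets roughly 112 GB/s of GDDR5 bandwidth all to itself.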
 
  • Like
Reactions: TJ Hooker

ManDaddio

Reputable
Oct 23, 2019
Are we forgetting who the APU market is? These are people on a budget who just need a computer; they don't need the bleeding edge. I have built, or upgraded, a number of computers over the years for people with very basic needs. The most demanding games they play are variations of solitaire, or an updated version of Minesweeper. Sorry, but playing solitaire at 144 fps on a 52-inch 4K monitor is not something the target market of these APUs is interested in. Post to Facebook, read ads on Craigslist, respond to emails, fire up a spreadsheet or word processor in Microsoft Works (yes, Works, not Office), maybe watch a puppy or kitten video on YouTube because their grandchild sent them a link in an email (after you explain how to click on that link in the email to be able to view that video).

So, who is up to playing Spider Solitaire on the 72" 240 Hz 8K monitor with the $200 computer I will custom build for you with these thrilling APUs?
Well said. I don't always agree with TH, but this article was a good opinion piece. I was, and still am, tired of people recommending AMD integrated chips as if they are great cheap mainstream gaming chips. Plus, in my experience AMD laptops are not great quality, whether or not the CPU is more "bang for the buck," which again is debatable depending on what you use it for. Most of my AMD laptops, compared to the Intel/Nvidia ones, are junky in my opinion. They wear fast and pieces fall off. But again, that is my experience.
 
  • Like
Reactions: JarredWaltonGPU
Well said. I don't always agree with TH, but this article was a good opinion piece. I was, and still am, tired of people recommending AMD integrated chips as if they are great cheap mainstream gaming chips. Plus, in my experience AMD laptops are not great quality, whether or not the CPU is more "bang for the buck," which again is debatable depending on what you use it for. Most of my AMD laptops, compared to the Intel/Nvidia ones, are junky in my opinion. They wear fast and pieces fall off. But again, that is my experience.
And that has nothing to do with AMD; rather, it's Intel making sure AMD never gets into any premium designs, perhaps through paying people off.

AMD's 4th-gen mobile CPUs destroy everything Intel has to offer in mobile. AMD beats Intel in games and most creative workloads. In addition to superior performance, AMD draws less power and outputs less heat, which is good for a laptop.

However, have fun finding a high-end AMD laptop with a high-end GPU. It doesn't happen. There is a reason for this...
 
  • Like
Reactions: King_V
And that has nothing to do with AMD; rather, it's Intel making sure AMD never gets into any premium designs, perhaps through paying people off.

AMD's 4th-gen mobile CPUs destroy everything Intel has to offer in mobile. AMD beats Intel in games and most creative workloads. In addition to superior performance, AMD draws less power and outputs less heat, which is good for a laptop.

However, have fun finding a high-end AMD laptop with a high-end GPU. It doesn't happen. There is a reason for this...
It has less to do with Intel and more to do with AMD not having a really great laptop solution prior to Renoir. Because when you start with a bunch of disclaimers like weaker battery life, that inherently limits the appeal. Now combine that with more than a decade of underwhelming AMD laptop solutions, and that's hard to overcome.

Athlon branded laptops were perceived as "cheap junk" for a long time, because they were cheap junk. That led to a perception that AMD just wasn't a good choice in anything besides a budget laptop -- and the higher end laptops that have used AMD CPUs in the past have traditionally done very poorly. Also, businesses are still mostly Intel-only shops, and businesses buy a lot of laptops.

But Renoir is making some inroads -- just look at the Asus Zephyrus G14, which is pretty much everything I could want from a laptop ... except I'm an older and bigger guy so I'd rather have a 15.6 inch display (1080p, thanks -- my eyes don't need a stinking 4K or even 1440p laptop display!)

IF AMD can continue to execute and deliver better-than-Intel performance on CPU, GPU, and battery life, the laptop makers will take notice. But it will probably take two or three years of solid AMD laptop solutions before we start to see more than a small handful of no-compromise high-end AMD laptops.
 

nofanneeded

Respectable
Sep 29, 2019
It's all about money for sure -- but not the way you're suggesting. No one has successfully made a fast iGPU solution for PCs. You're saying Microsoft paid off not just AMD, but every other company that might think about doing such a thing. Not a chance. And equally unlikely MS was able to pay AMD off.

Easiest way to disprove that assertion: If MS could pay AMD to not make a faster iGPU for PC ... wouldn't it make far more sense to pay AMD to not make such a thing for Sony? Microsoft isn't worried about competition from PCs killing the market for Xbox at all. It hasn't been a problem since the original Xbox; why would things change now?

And $50 for 4GB of HBM2 is extremely expensive on a component level. I gave math earlier, but basically a chip like Picasso costs AMD around $35 to make, and Renoir is maybe $50-$60. So if AMD put a $60 chip with $100 of HBM2, it would need to either sell tens of millions of the chips at around $200 each, or else price would have to be much, much higher -- like $500+.

There simply isn't enough demand or profit potential in making an extreme-performance integrated graphics solution right now. Only custom designs that are basically guaranteed to sell tens of millions of units over time (Xbox, PlayStation, or Apple) can justify the cost.

MS doesn't need to pay AMD off; they just order millions of chips on the condition that nothing like it is released into the desktop market.

And a big no: $50 more for HBM2 is not that expensive when you give me an APU that runs at least 6 Tflops. An integrated GPU is a LOT cheaper than graphics cards, which have more components on the card, plus the defect rates in making cards.

That $50 more for HBM2 plus the APU price will still be cheaper than a 6 Tflops GPU + CPU no matter how you put it, and add to it the bonus of compactness, lower power, less heat, etc.

$500? Sorry, that's your overpriced estimate, based on nothing. And don't worry: IF AMD decides to raise the onboard GPU to at least 6 Tflops like the last-gen Xbox, or up to 12 Tflops like the new one, then they WILL sell millions and they will make Intel CPUs with onboard graphics a joke.

A few years ago I posted my ideas about how the CPU market was cheating us on prices and everything was overpriced, and I got flamed here like crazy by everyone. When AMD stepped in, Intel showed us they had been robbing us when they cut their CPU prices in half. But back then? You all talked the same way, trying to justify monopoly pricing.

Sorry, that doesn't work anymore. Actually, there is another reason why AMD is not releasing a powerful APU to the market: it would hurt their graphics cards badly. No one would buy them, because AMD is not strong in the flagship segment at the moment -- Nvidia is beating them at the high end. AMD's GPU successes are all in the midrange cards, which are a direct competitor to the 6-12 Tflops APUs that are in consoles, so AMD would lose the graphics card market if they sold a fast APU.
 
MS doesn't need to pay AMD off; they just order millions of chips on the condition that nothing like it is released into the desktop market.

And a big no: $50 more for HBM2 is not that expensive when you give me an APU that runs at least 6 Tflops. An integrated GPU is a LOT cheaper than graphics cards, which have more components on the card, plus the defect rates in making cards.

That $50 more for HBM2 plus the APU price will still be cheaper than a 6 Tflops GPU no matter how you put it, and add to it the bonus of compactness, lower power, less heat, etc.

$500? Sorry, that's your overpriced estimate, based on nothing. And don't worry: IF AMD decides to raise the onboard GPU to at least 6 Tflops like the last-gen Xbox, or up to 12 Tflops like the new one, then they WILL sell millions and they will make Intel CPUs with onboard graphics a joke.

A few years ago I posted my ideas about how the CPU market was cheating us on prices and everything was overpriced, and I got flamed here like crazy by everyone. When AMD stepped in, Intel showed us they had been robbing us when they cut their CPU prices in half. But back then? You all talked the same way, trying to justify monopoly pricing.

Sorry, that doesn't work anymore. Actually, there is another reason why AMD is not releasing a powerful APU to the market: it would hurt their graphics cards badly. No one would buy them, because AMD is not strong in the flagship segment at the moment -- Nvidia is beating them at the high end. AMD's GPU successes are all in the midrange cards, which are a direct competitor to the 6-12 Tflops APUs that are in consoles, so AMD would lose the graphics card market if they sold a fast APU.
You don't get how things work if you think a component cost of $50 is small. That's the raw material cost. You have to integrate that with other components (which costs money), pay for R&D (lots of money), and still make a profit -- not just you, but your partners. So a chip that has a raw cost of $150 either needs to sell a LOT of units at a price of around $225, or it needs to sell a decent number of units at a price of $300, or it can sell a smaller number of units at $500. Economies of scale.

AMD's current Renoir CPUs probably cost $60 to make. How much does AMD charge? Anywhere from $150-$300 probably, depending on who's buying and the volumes involved. If you have a chip design that costs three times as much, guess what: the end price needs to increase 3X to compensate. (Assuming you sell fewer chips because the cost is higher -- theoretically, if you have the same total number of sales, then it's a different story.)

AMD makes a chip for $60. It has to cover R&D and other expenses, so the real cost is more like $120. Then it needs a profit, so it sells for $150-$200 (or more if it can). Then the company that bought the chip has to make a profit, as do the distributors and retailers. Which ultimately makes the end user price MUCH higher than the original $60.

The reason no one makes a console-like "APU" is that they can't guarantee the numbers to make it profitable. Xbox or PlayStation are basically guaranteed to do tens of millions of unit sales over the first few years, probably 100 million or more by the time the console is ready to be retired.

And MS and Sony also make a decent chunk of money off licensing fees for game developers to sell games on their consoles. PC games don't have to pay AMD money, or Nvidia, or anyone else (other than Steam or Epic or whatever). So MS can take a console that costs $500 in raw materials and sell it for $450 and still come out ahead in the long run. If AMD makes an APU that costs $150 in raw materials and sells it for $150, it will lose money and have no way to make that back.
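Here's a hypothetical sketch of why the console math works where a bare APU doesn't. The $10 platform cut and the ten-games attach rate are made-up round numbers for illustration, not actual MS or Sony figures.

def platform_profit(bom, hardware_price, games_per_console, platform_cut=10):
    # A console maker can sell hardware at or below BOM cost because every
    # game sold on the platform returns a licensing fee to the maker.
    return (hardware_price - bom) + games_per_console * platform_cut

print(platform_profit(500, 450, 10))  # -$50 on hardware, +$100 in fees -> +$50 per console

A chip vendor selling a bare APU has no licensing tail, so the chip itself has to carry all of the margin.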
 
  • Like
Reactions: TJ Hooker

Chung Leong

Reputable
Dec 6, 2019
The reason no one makes a console-like "APU" is that they can't guarantee the numbers to make it profitable. Xbox or PlayStation are basically guaranteed to do tens of millions of unit sales over the first few years, probably 100 million or more by the time the console is ready to be retired.

Intel can probably do it, since they wouldn't have to turn a profit on the product. Achieving economy of scale is more important. Money will come in from the enterprise side.
 
Intel can probably do it, since they wouldn't have to turn a profit on the product. Achieving economy of scale is more important. Money will come in from the enterprise side.
Lots of companies could do it, but none of them are doing it because they don't see a demand for such a product. That's the simple truth. Enterprise wants either a laptop chip and who cares about gaming -- which is what all current Intel chips provide, and it suits enterprise users just fine -- or else they want massive compute for HPC environments, in which case integrated graphics is a stupid idea. Intel will have Xe HP and Xe HPC for data centers this year (or next for Xe HPC?), and it will have laptops with serviceable graphics with Tiger Lake. That satisfies pretty much all the demands of enterprise users.
 
  • Like
Reactions: TJ Hooker

King_V

Illustrious
Ambassador
$500? Sorry, that's your overpriced estimate, based on nothing ...

As opposed to the following?
MS doesn't need to pay AMD off; they just order millions of chips on the condition that nothing like it is released into the desktop market.

If there's anything here that's based on nothing, it's coming from you.

In other words, with regard to your MS claim: Citations, please.
 
  • Like
Reactions: TJ Hooker
And a big no: $50 more for HBM2 is not that expensive when you give me an APU that runs at least 6 Tflops. An integrated GPU is a LOT cheaper than graphics cards, which have more components on the card, plus the defect rates in making cards.
How much is this APU with HBM2 memory supposed to cost? It's been possible to get ~6 Tflop graphics cards for a little over $150 for quite a while now. The 4GB RX 480 launched for just $200 four years ago while offering nearly 6 Tflops of compute performance, which was multiple times the graphics performance of the PS4 and Xbox One that came out just two and a half years prior, and roughly on par with the raw graphics performance of the "4K" consoles that didn't come until later. After this next generation of consoles launch, it likely won't be too long before relatively inexpensive graphics cards are significantly outperforming them as well.

Between the HBM2, the graphics chip, and the better cooling necessary to keep temperatures in check (a Wraith Prism, at the very least), the price increase over a comparable processor without graphics would be just about on par with simply buying a dedicated card of comparable performance. And of course, if you want to upgrade your graphics performance down the line, you would need to replace your CPU cores as well if you wanted to maintain the same form factor. And that's assuming the motherboard is still compatible with the newer APUs being released a few years down the line.
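A quick sanity check on those figures, using the usual peak-FP32 formula and the commonly cited launch specs for each part:

def peak_tflops(shaders, ghz):
    # Peak FP32 = shaders x 2 ops per clock (FMA) x clock in GHz
    return shaders * 2 * ghz / 1000

print(peak_tflops(2304, 1.266))  # RX 480: ~5.8 Tflops
print(peak_tflops(1152, 0.800))  # PS4: ~1.84 Tflops
print(peak_tflops(768, 0.853))   # Xbox One: ~1.31 Tflops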
 
  • Like
Reactions: TJ Hooker

Awev

Reputable
Jun 4, 2020
My domestic cats have nightmares about rats the size of dogs, while tigers dream about deer. They are both in the feline family, yet different animals. Game consoles and personal computers are both in the micro processor family, yet they have different roles to play.

A game console is a closed system, and so the operating system (OS) can be gutted. While the components may run slower, the perception is that it is faster because it doesn't have all of the bloatware that a desktop operating system has (even compiling your own OS, such as Gentoo Linux, helps -- there is still a bunch of code, and hence overhead). And while my printer has a USB port, it is limited: it doesn't do music or an Ethernet connection through it, nor a lot of the other things you can use a USB port for, only pictures and PDFs, while the printer does have a small color LCD display. Just don't try playing games on it.

About 22 years ago I worked retail in the electronics department of a large nationwide retailer. This was around the same time that HD was making inroads, and rear-projection televisions were the main way to watch something on a 52" or 60" screen. On a night I had off, a co-worker had a customer who had $500 eyes. He came in with his wife and could not tell the difference between a $500 SD TV (really it was just a high-end analog set, but I say SD so you have some idea of the picture quality) and a $2,500 top-of-the-line HD TV, both 32 inches in size. Since he could not tell the difference, he could save $2,000 by going with the $500 TV -- hence the $500 eyes.

Let's keep the article in perspective. It is about integrated graphics on a CPU, also commonly referred to as APUs, and about setting baselines for comparison in the future. There is no question that a game console will do better than iGPUs, while dGPUs are the king of the hill. And while it is a nice idea to take some hints from game consoles to make iGPUs better, we are talking about two different animals, and it just doesn't work that way. And while most of us on this forum will likely use a dedicated graphics card, it is still nice to know how the integrated ones stack up, so if you are like me and build systems for family and friends with basic requirements, you know what is available.
 
  • Like
Reactions: King_V

watzupken

Reputable
Mar 16, 2020
1,030
521
6,070
I agree. An APU is an APU, and honestly, it serves the purpose for the market it is targeting. The target market is likely budget gamers, or people who don't want a dGPU because of their usage. Yes, if it could go faster I would like that too, as long as it doesn't result in significantly higher cost. If the cost approaches that of a CPU + low-end GPU, then it kind of defeats the purpose.

If you are looking for higher performance, it makes sense to just grab a dedicated graphics card instead. Consider this: if AMD or Intel were to offer you an APU that costs almost as much as buying a CPU and dGPU, would you be willing to buy it? The previous Intel CPU + Vega GPU combo may be a good indication of the price to expect.
 
About 22 years ago I worked retail in the electronics department of a large nationwide retailer. This was around the same time that HD was making inroads, and rear-projection televisions were the main way to watch something on a 52" or 60" screen. On a night I had off, a co-worker had a customer who had $500 eyes. He came in with his wife and could not tell the difference between a $500 SD TV (really it was just a high-end analog set, but I say SD so you have some idea of the picture quality) and a $2,500 top-of-the-line HD TV, both 32 inches in size. Since he could not tell the difference, he could save $2,000 by going with the $500 TV -- hence the $500 eyes.
What system was being used back in ... 1998? I can't recall seeing any actual HDTV (720p or above) sets until the early 00s, though I admit I wasn't living on the cutting edge of TV setups. I do remember my roommate in college buying a $2500 Sony Wega Trinitron TV at one point. I can't remember the size or even if it was widescreen, but I'm pretty sure it was still SD -- this would have been around 1997-1998.

But those early years of HD broadcasts were often terrible -- about the only thing broadcast as a true HD (720p) recording even in the mid-00s was sports. I bought a 46-inch HDTV in 2004 or 2005, for about $1200 -- and that was a cheaper non-DLP setup. (DLP was $2500+ still.) I was so excited to try it out, and then found out that ABC HD, NBC HD, CBS HD, Fox HD, and even ESPN HD were 95% SD broadcasts just upscaled to HD -- with the black pillarboxing on the sides unless you stretched the image. But dang those football and basketball broadcasts made me happy! LOL

Anyway, in the 90s the guy definitely would have saved money by sticking with SD. Large 60-inch HDTVs even in the early 00s were more like $20,000 for a year or two after launch. And those were early plasma TVs that were extremely prone to burn-in. Heck, my 46-inch rear-projection TV still got burn-in after a few years -- thanks to all the ESPN / SportsCenter viewing I did.