Because everyone hates it when their whole monitor isn't being used. But then again, don't use DLSS, FSR, or XeSS. TVs have hardware upscaling built in that looks far better than any AI implementation. The solution is to use your TV as a monitor, especially now that (some) TVs have VRR and 120Hz support.
Yep, this is true. Back in the day, when I had to drop my resolution to 1080p to play AC:Odyssey on an R9 Fury, it struck me that it didn't look any different from 1440p. I then tried the benchmark in Far Cry 5 and couldn't tell the difference between 720p and even 4K! I was looking closely at telltale details on the screen like foliage and water quality. I was less than 30cm from the screen looking for differences and I honestly couldn't find any.
I spoke to Jim from AdoredTV about it and he was as baffled as I was. I did a bit of research and discovered that big-screen 4K TVs have hardware upscalers in them because the majority of television broadcasts at the time were 480p (DVD quality), with the odd station having 720p. To keep these broadcasts from looking absolutely terrible like they did on the old projection sets from the 80s and 90s, upscalers would clean the image up so that it looked as clear on a 55" panel as it would on an old 20" CRT.
Jim agreed that this had to be the reason, because on a regular monitor the differences would stand in stark contrast to each other.
They are used mostly in the HTPC world, where you want the ability to do "light" gaming in your living room, watch movies, or do other computer-related things. Intel's iGPUs aren't strong enough for anything past movies, while the APUs are just strong enough for actual gameplay. My working theory on why there are no decent 7-series APUs is that AMD didn't want to compete with its own GPU market. So let's see what happens next generation.
That's a good theory too. Time, as always, will tell. (Stolen from GamerMeld)
Yeah, I'm really hoping Intel takes over the abandoned low/midrange market.
How is the market abandoned? Isn't the low/midrange market the domain of the RX 7600 and RTX 4060? As far as I can tell, only the bottom end of the market has been abandoned this generation: the domain of the RX 6400, the 6500 XT (I still don't know what the XT is for), the RTX 3050, and the GTX 1630/50/60. I'm thinking the major players expect last-gen cards to be "good enough" for this market segment. I'm not saying that I agree with them (because I don't), but I do believe that's their line of thinking.