Gaming in 4K: i7-8700K + GTX 1070 or Xbox One X?

mceja89

I hadn't seriously considered gaming in 4K until I was recently gifted an Xbox One X. My question is: will the Xbox One X deliver a 4K gaming experience that is decidedly superior to my PC's? Ideally I'd be using a 4K TV such as the LG OLED55B7A.

My specs:
ASRock Z370 Extreme4 motherboard
Intel Core i7-8700K CPU
32 GB DDR4 RAM
MSI GTX 1070 Gaming X

Thank you for reading, any input would be greatly appreciated!
 
The GTX 1070 will do native 4K at 30 FPS more consistently than the Xbox One X. Getting 60 FPS would require reductions in graphical settings, and likely dropping down to something like 1800p to keep frame rates playable. The One X is mostly locked to 30 FPS too, but most of its games aren't actually rendered at native 4K; they run at around 1800p, at best hitting native 4K some of the time via dynamic resolution scaling when GPU load is light. The only games that run natively at 4K without frequently dipping below that resolution are less graphically demanding titles like Forza or Killer Instinct.
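To make the dynamic resolution scaling point concrete, here's a minimal sketch of how that technique generally works. The thresholds, step sizes, and function name are made-up illustration values, not the One X's actual tuning:

```python
# Illustrative dynamic resolution scaling (DRS): pick the next frame's render
# height from how long the last frame took, so the frame-rate target holds.

TARGET_FRAME_MS = 33.3   # frame budget for a 30 FPS target
NATIVE_H = 2160          # native 4K render height
MIN_H = 1800             # floor the renderer won't drop below

def next_render_height(current_h: int, last_frame_ms: float) -> int:
    if last_frame_ms > TARGET_FRAME_MS * 1.05:   # over budget: shrink 5%
        return max(MIN_H, int(current_h * 0.95))
    if last_frame_ms < TARGET_FRAME_MS * 0.85:   # plenty of headroom: grow 5%
        return min(NATIVE_H, int(current_h * 1.05))
    return current_h                             # close to budget: hold

print(next_render_height(2160, 38.0))  # heavy frame -> 2052 (sub-native)
print(next_render_height(1800, 25.0))  # light frame -> 1890 (climbing back)
```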
 
Solution

First off, if you don't have the 4K TV yet, know that it won't display HD TV broadcasts (1080i/720p) as well as a TV with native 1080p resolution. I learned this by making the mistake of buying a Sony 49" X900E, one of the best-rated 4K TVs in its price range. Salesmen, the media, and the public in general are overhyping 4K to the point of sweeping its glaring flaws under the rug.

Secondly, there's not nearly enough 4K content available yet to warrant buying into it, especially for those on a budget. I've read that only about 17% of US citizens have a 4K display so far. The UHD broadcast standard (ATSC 3.0) is nearly finalized, but it won't be until early 2019 that TVs in the US start getting ATSC 3.0 tuners, and it will take at least 5 years before UHD broadcasts are prevalent enough to warrant having a 4K TV for actual TV use.

So, if you care at all about actual TV, I would just use the very nice gift of the One X for now. As to your question about PC GPU power for 4K, it takes much more than a 1070. I was using an EVGA 1080 Ti FTW3 when testing 4K gaming, which is pretty much THE most powerful card you can get in the sub-$800 range, and it would not play games like Far Cry 4, Just Cause 3, or Ghost Recon Wildlands at more than 45 FPS or so.

Plus, Windows 10 does not play well with 4K. It scales fonts to be more readable, but in doing so it also messes up some programs' GUIs. If you choose a lower scaling percentage, the GUIs render properly, but fonts can be too small to read easily. I also found that when lowering the resolution, even on the Sony Bravia models, which have best-in-industry scaling and upconversion, the image quality was horrible.
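To put numbers on the scaling issue, here's some quick pixel-density geometry; the 27" screen size is just an example value:

```python
# Pixel density (PPI) from resolution and diagonal size: pure geometry.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 27)))  # ~82 PPI: text readable at 100% scaling
print(round(ppi(3840, 2160, 27)))  # ~163 PPI: text half-size at 100% scaling

# Windows' 150% scaling brings effective density back to ~109 PPI (163 / 1.5),
# which is readable, but apps that ignore DPI awareness render broken GUIs.
```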

Then there's the HDR problem. Nvidia seems to want to cling to Microsoft's HDR through Windows, but it's broken. Windows HDR only works with 4:2:2 chroma output, and it also makes the desktop look all washed out. Mass Effect: Andromeda was the only game I was able to test in this type of HDR, and I had to have HDR enabled in Windows but disabled in the game. It looked fantastic, but unfortunately the game crashed every time I hit Esc, so I couldn't play it.
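For anyone wondering why Windows HDR gets forced into 4:2:2, a back-of-the-envelope bandwidth calculation makes it clear. This counts active pixels only and ignores blanking overhead, so real-world requirements are somewhat higher:

```python
# Why 4K60 10-bit HDR didn't fit over HDMI 2.0 without chroma subsampling.

PIXELS = 3840 * 2160
FPS = 60
HDMI20_EFFECTIVE_GBPS = 14.4  # 18 Gbps raw minus 8b/10b encoding overhead

def gbps(bits_per_pixel: int) -> float:
    return PIXELS * FPS * bits_per_pixel / 1e9

print(round(gbps(30), 1))  # RGB/4:4:4 at 10-bit: ~14.9 Gbps -> doesn't fit
print(round(gbps(20), 1))  # 4:2:2 at 10-bit:     ~10.0 Gbps -> fits
```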

The real irony is that, despite AMD not being very competitive with their GPU hardware lately, they've actually figured HDR out better than Nvidia. They bypass the Windows HDR and use their own, and from what players who use it are saying, it works consistently.

I do know that HDR on the One X works, though, so it's pretty odd MS can't get it to work on Windows. 4K will eventually be awesome for TV, gaming, and desktop use, but for now it's not at all ready for prime time.
 
Solution
You will need at least a 1070 Ti to get a native 4K experience at a stable 30+ FPS, and here is an article you should read:
https://techgage.com/article/nvidia-geforce-gtx-1070-review-a-look-at-1440p-4k-ultra-wide-gaming/

If you only want to play older games (pre-2016) at 4K resolution, even a GTX 970 can handle most of them at high or medium settings.
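That tracks with simple pixel math: 4K pushes four times the pixels of 1080p, so a card that's comfortable at 1080p in lighter titles has far less headroom at 4K:

```python
# Pixel counts per frame at common resolutions, relative to 4K.
resolutions = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}
for name, px in resolutions.items():
    print(f"{name}: {px:>9,} pixels (4K draws {resolutions['4K'] / px:.2f}x as many)")
# 1080p: 2,073,600 pixels (4K draws 4.00x as many)
# 1440p: 3,686,400 pixels (4K draws 2.25x as many)
# 4K:    8,294,400 pixels (4K draws 1.00x as many)
```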

If you are NOT planning to play STGs (shooting games) or PC exclusives, go get the Xbox One X instead.
 
Also pay attention to motion comparisons (30 FPS vs 60 FPS), because a lot of TVs exhibit much more noticeable motion blur at 30 FPS than at 60 FPS. That means all those extra pixels come with the tradeoff of distracting, annoying blur.

There are sites that show what your 4K TV will look like at 60 FPS and 30 FPS, like this one...

https://testufo.com/
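The math behind those tests is simple: on a sample-and-hold display, perceived blur is roughly your eye-tracking speed multiplied by how long each frame stays on screen. A quick sketch, using 960 px/s as the pursuit speed (a typical TestUFO setting; treat the figures as approximate):

```python
# Sample-and-hold motion blur: smear width ~= tracking speed x frame time.

def blur_px(speed_px_per_s: float, fps: float) -> float:
    return speed_px_per_s / fps

print(blur_px(960, 30))  # 32.0 px of smear per frame at 30 FPS
print(blur_px(960, 60))  # 16.0 px at 60 FPS, i.e. half the blur
```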
 

Not completely true. Yes, this can be the case with entry- to mid-range 4K TVs, but some mid-range and most higher-end recent models shouldn't suffer. I have the 55B7V and it is just superb at every resolution I have thrown at it, putting my old 1080p set to shame.
 


It's even more true with today's TVs when it comes to the mainstream models that the vast majority of people can afford. The TV industry is basically falling apart. Panasonic ran into financial trouble and now only sells in select sizes and select countries using LG panels, and even LG's own TVs, like the 6300, have issues with build quality and with the motion control feature not working.

The fact that you give an example of an OLED model that costs over 1,300 pounds ($1,619 USD) even on Black Friday kind of proves my point, and OLED, by the way, is no winner either. You get a great picture when they're new, but they're all susceptible to burn-in, and their blue pixels have an expected lifespan of only 20,000 hours due to deterioration. That's a third the life of the average LCD screen. As they deteriorate the screen gets darker, and OLEDs lack brightness compared to other tech to begin with. You couldn't have demonstrated better with your post that they are turning the TV industry into a rich man's game, where it's all about expensive, disposable TVs.
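To put those quoted lifespan figures in day-to-day terms, here's the arithmetic using the numbers above (~20,000 hours for OLED blue, three times that for an LCD):

```python
# Screen lifespan in years at typical daily use, from the hour figures above.
OLED_BLUE_HRS = 20_000
LCD_HRS = OLED_BLUE_HRS * 3  # "a third the life of the average LCD"

for hrs_per_day in (4, 8):
    oled_yrs = OLED_BLUE_HRS / (hrs_per_day * 365)
    lcd_yrs = LCD_HRS / (hrs_per_day * 365)
    print(f"{hrs_per_day} hrs/day: OLED ~{oled_yrs:.0f} yrs, LCD ~{lcd_yrs:.0f} yrs")
# 4 hrs/day: OLED ~14 yrs, LCD ~41 yrs
# 8 hrs/day: OLED ~7 yrs, LCD ~21 yrs
```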

Regarding 4K, the only reason 4K TVs are selling so well is that the manufacturers are forcing them on the public; there are no other choices anymore in new TVs.

LG and Samsung are the worst offenders in the demise of the TV industry. They only put quality into their ultra-expensive models. Both are known for producing flimsy models that can arrive with shattered screens even when transported with no damage to the outer box; they are built and packed horribly. Just look at the customer reviews of LG's 6300 models. The 6300s are also being sold as 4K TVs when in fact they use an RGBW subpixel format, where every fourth subpixel is white, so technically it's only 3K. This is the kind of deception LG has used for many years, along with panel lotteries, meaning they've been known to ship a mixed bag of IPS and VA panels in prior models that were supposed to be IPS.
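Here's the RGBW "3K" arithmetic spelled out; this simply works through the claim above, assuming every fourth subpixel is white:

```python
# RGBW vs true RGB: same number of subpixel positions per row, but one in
# four carries no color, so the effective color resolution drops by a quarter.
NOMINAL_WIDTH = 3840
rgb_subpixels_per_row = NOMINAL_WIDTH * 3              # true RGB 4K: 11,520
rgbw_color_subpixels = rgb_subpixels_per_row * 3 // 4  # white every 4th: 8,640

print(rgbw_color_subpixels // 3)  # 2880 -> effective "3K" color width
```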

Well before OLED even became popular among the rich, LG said on Celebrity Apprentice, during a task they oversaw for an ad competition for their large-screen TVs, that their target demographic was customers who make $74,000 a year or more. LG built its empire on the poor with affordable products, and now all they do is take a crap on them with shoddy build quality and horrible customer service.

But to respond to your point: yes, OLEDs don't have the motion problems LCDs do, but they have other problems that make them equally undesirable, if not more so.
 
For 4K, your CPU and RAM are overkill; the GPU is the main bottleneck at that resolution. There's a reason 4K is called the "equalizer": it makes no difference at this point whether you have a high-end CPU or a mid-range one, because the GPU will hold you back before the CPU does. My 1080 Ti is holding back my 6600K. So if you had a cheaper previous-gen i5, half the RAM, and a better GPU, you would still get better FPS at 4K. If a 1080 Ti is out of your reach, consider selling some parts (at least the 1070); 16 GB is more than enough RAM for today's games. As you can see here, there is not even a 1 FPS difference between the CPUs at 4K (3840x2160) across multiple games:
https://www.techpowerup.com/reviews/Intel/Core_i5_8600K/12.html
I would say try your best to upgrade your GPU, even if it means selling parts. I used to have a 1070, and I can say it left a lot to be desired after I upgraded to my 4K screen. The 1080 Ti has been great, and I wouldn't go any lower than that, because even a 1080 Ti struggles at times. Also make sure to overclock it, and you are going to want a model with good air cooling, or preferably water cooling.
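As a rough sanity check on the "equalizer" point: once you're fully GPU-bound, frame rate scales roughly with the inverse of pixel count, and the CPU barely matters. The 60 FPS base figure below is made up for illustration, not a benchmark result:

```python
# Crude GPU-bound estimate: FPS falls in proportion to pixels drawn per frame.

def estimate_fps(known_fps: float, from_px: int, to_px: int) -> float:
    return known_fps * from_px / to_px

fps_1440p = 60.0  # hypothetical 1440p result for some GPU-bound game
print(round(estimate_fps(fps_1440p, 2560 * 1440, 3840 * 2160), 1))
# -> 26.7: a card doing 60 FPS at 1440p lands well under 30 FPS at 4K,
# regardless of whether the CPU is a mid-range i5 or a high-end i7.
```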
As for the Xbox One, it doesn't deliver any kind of true 4K experience whatsoever; the hardware inside is nowhere near capable. It's all 1080p upscaled to fake 4K. Technically this looks better than 1080p, which is why people are buying into it, but it's nowhere near as good as actual 4K.