4K at 60 FPS possible? Would this rig be overkill?

Tomimimi

Reputable
Jun 18, 2015
19
0
4,510
Hello,

I would rather build a PC that can handle 1440p at ultra settings at 60fps than 4K at ultra settings at 30-40fps.

Is 4K at ultra settings at 60fps possible?

If not, would the following rig be capable of running 1440p at ultra settings at 60fps, or is it overkill?

- i7 5820K
- GTX 980 Ti
- X99 series mobo
- 16GB RAM / 500GB SSDs and extra extra~

Your help will be much appreciated!
Please help this noob PC builder and game enthusiast play some games... I've only ever watched YouTube clips.. TT.TT Now I have a budget, so please advise me!!

Thanks!

Tom
 
Solution
A single 980 Ti isn't going to give you 60fps at 4K; there are quite a few benchmarks out there, and in modern games 35-45fps seems about average. Dual 980 Ti would be the way to go.

Also, is this just a gaming build? If so, the only reason to go X99 is if you need the extra PCIe lanes for 3- or 4-way SLI. For dual SLI or less, the 4790K is just as strong and a heck of a lot cheaper.
 

Tomimimi

Reputable
Jun 18, 2015
19
0
4,510


But according to benchmarks, even GTX 980 Ti SLI cannot handle 4K at 60fps, if I remember correctly...

I always dreamt of playing Star Citizen-like games at 4K 60fps. That's not possible, is it??
 

DasHotShot

Honorable
4K at 60 FPS is tricky with a single GPU for most AAA titles.

The problem is also that the visual difference between 1440p with 4x AA and 4K is quite modest and hard to spot.

Add to this the fact that you are rendering more than twice as many pixels at 4K for little visual benefit, and it makes for inefficient use of GPU power. I would recommend aiming for high-FPS 1440p, as it is the perfect experience IMO for now.
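Rough numbers to put that in context (assuming the usual 2560x1440 and 3840x2160 panel resolutions):

2560 x 1440 = 3,686,400 pixels (~3.7 MP)
3840 x 2160 = 8,294,400 pixels (~8.3 MP)

So each 4K frame is roughly 2.25x the pixel work of a 1440p frame, for a fairly subtle visual gain.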

Also, you should consider a Fury X over the 980 Ti for high-res gaming, as their tech and HBM memory make really light work of these resolutions and settings. It is the way forward, and that is coming from an Nvidia fan!
 

RobCrezz

Expert
Ambassador


Any benchmarks backing this up?

 


Anything to back this statement up? Currently my view is that SLI has fewer driver issues, and when there are issues they are fixed much quicker. As I haven't seen any benchmarks for the Fury, I am not sure anyone can recommend it; both Nvidia and AMD have been known to put a great marketing spin on upcoming improvements that turn out to be nothing worth noting. The only thing that counts is real results.
 

DasHotShot

Honorable


https://techreport.com/news/28501/here-a-first-look-at-the-radeon-r9-fury-x-performance?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+techreport%2Fall+%28The+Tech+Report%29

These aren't third-party numbers, but they're an early indicator.

I will happily back this up further when those reviews come in.

Also, I am basing my statement on pure technological facts. AMD's new cards lend themselves to handling high resolutions better from a VRAM and pure rendering perspective, as the Fury X doesn't render 400 frames and use only the 144 or 60 good ones... whereas the 980 Ti does exactly that.

More VRAM and more of everything isn't necessarily the answer.
 

Jonathan Cave

Honorable
Oct 17, 2013
1,426
0
11,660


I have my 980 Ti on order and I may cancel it and order a Fury X. When you mention that the Fury X won't render unused frames, can you elaborate a little further please? I understand that with Vsync on, or using a variable refresh rate monitor, the GPU renders what it needs and no more?


 

DasHotShot

Honorable
Vsync/G-Sync/FreeSync is there to match rendered frames to the screen refresh rate. So it only makes sure that the gameplay looks smooth.

What AMD have introduced is technology which essentially allows the GPU and CPU to communicate and figure out how many FPS are necessary for any given scenario, and only render those.

Until now, GPUs would render quite a lot more frames (for the 980 Ti that could sometimes be up to 400 FPS for a 60 FPS scene) and just pick out the required ones, effectively discarding the rest. This meant, especially at high resolutions, that GDDR was taken up storing assets that were rendered at a cost to compute power, leading to this 12GB nonsense from Nvidia on the Titan X, and then discarded.

More isn't more if you are producing waste... It is like renting an extra warehouse to store excess stock which you don't plan to sell.

By streamlining the process and only rendering what is required, you save resources like energy (hence the lower power consumption from the chips themselves), compute power and the need for loads of VRAM. By hugely increasing the memory bandwidth and introducing this software feature, AMD has leaped ahead in terms of efficiency and given us a look at how the GPU market of tomorrow will look.

Their 4GB solution, according to their official benchmarks (and some synthetic ones by others), trumps Nvidia's 6GB and 12GB solutions by simply being more efficient and targeted.

It is a similar evolution to what we saw in phone manufacturing. ARM revolutionized what is possible there through power-efficient, targeted hardware, decreasing the space and battery power required... leading to today's smartphones.

AMD have effectively given us the "Concorde" moment in graphics, by really dragging us into a new age.
 

DasHotShot

Honorable


What makes you say that?

Their Fury Nano is 2x more powerful than a 290X and needs 175W (one 8-pin connector).

Even the flagship cards with watercooling solutions, which draw loads, use about as much as a Titan X...
 

RobCrezz

Expert
Ambassador


One 8-pin connector doesn't automatically mean 175W. AMD have had cards that draw more than they should based on the connectors; look at the 295X2: two 8-pin connectors, but it draws up to 470W...
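For reference, the PCIe spec rates the x16 slot at 75W, a 6-pin at 75W and an 8-pin at 150W, so on paper a 2x 8-pin card tops out at:

150 + 150 + 75 = 375W

yet the 295X2 pulls up to ~470W, which is exactly why connector count alone tells you very little about real draw.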

No point guessing at this point; wait for the figures.
 

DasHotShot

Honorable
Huh?

The 295X2 was two 290Xs glued together... what did you expect it would draw?

Also, they released TDP figures and acceptable power draw bands for OCing:

Fury Nano: 175W up to 250W
Fury X: 275W up to 375W

It is the watercooled XT that will draw loads, because it has a closed-loop watercooler bolted on... As such I don't see it as being WAY behind. Don't get me wrong, I am loyal to Nvidia and use their GPUs, however we have to accept some facts and figures here.
 

Tomimimi

Reputable
Jun 18, 2015
19
0
4,510


Hey DasHotShot, thanks for your reply mate.

Okay, I'm not a tech geek, so I don't have much knowledge.

I understand that 4K ultra at 60fps is impractical today.

But if I want to play games at ultra settings at 60fps on a 27 inch monitor, which resolution should I go for???

I'm just so confused about resolution and monitor size, on top of all the complexity around GPUs and everything...

Can you please spare some of your time and advise me on which monitor + PC parts I should buy?

 

DasHotShot

Honorable
Well it all depends.

The other experienced colleagues here will also give their view, I am sure, and the great thing is we all have a different view.

PC gaming is a science to a degree; however, it is so diverse in terms of components and actual gameplay options that it is important to first set one's expectations. As you have detailed yours, I would suggest the following parameters:

The size of the monitor doesn't determine the resolution. A 27 inch screen can be 1080p (Full HD), 1440p (QHD/2K) or 2160p (UHD/4K). On top of this you can have a 60Hz refresh rate (meaning you can't see more than 60FPS and may as well just have Vsync on), or you can have 120Hz, 144Hz, or the best possible option of G-Sync (Nvidia) or FreeSync (AMD).
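To put some rough numbers on that for a 27 inch 16:9 panel (approximate pixel densities, so treat them as ballpark figures):

1920 x 1080 at 27 inch ≈ 82 PPI
2560 x 1440 at 27 inch ≈ 109 PPI
3840 x 2160 at 27 inch ≈ 163 PPI

Same physical size, very different sharpness, and a very different GPU load.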

G-Sync and FreeSync communicate with the GPU and scale the refresh rate to the displayed FPS at all times, giving super smooth gameplay.

To run a 1440p 144Hz experience, you are already at the highest end of the spectrum for hardware requirements and entering the enthusiast level. This is really costly, doesn't represent bang for buck, and sets out to give the highest possible visual experience no matter the cost.

The components required reflect this: a powerful quad-core or better Intel CPU, 8GB of RAM (16GB can be nice for a RAM disk), at least a single 980 Ti or better, etc.

If you settle for a lower resolution and opt for high FPS (good for first-person shooter games), you can probably lower those specs and still get a very high-end experience.

In the end it all comes down to budget.

So why don't you give us a number in your local currency and tell us which country you live in (which country you are buying in)? Then we can all make suggestions and you will have a really good range of advice to choose from.
 

RobCrezz

Expert
Ambassador


I expected it to draw that; my point was that you cannot guess power usage from the connectors used. Two 8-pins isn't even within spec for that amount of draw.

Watercooling doesn't make it draw more power; I don't know why you think that.
 

DasHotShot

Honorable


Operating a cooling loop and a 120mm fan on the radiator draws more power than just a small fan or set of fans on an air cooler.

I also went by what AMD said about their reference cooler's power draw and what other similar coolers on Nvidia cards draw.

If you read up a little on the Fury release, you will stumble across most of the info I have given you. Also, regardless of whether you trust them, the manufacturer should be used as an official source until tests in real-world scenarios are performed.

It's not like benchmarks and power tests will give us results that are off the mark by 50 or 100%, is it?
 

Tomimimi

Reputable
Jun 18, 2015
19
0
4,510


Thanks for the advice,

I'm a South Korean living in Sydney, Australia, so AUD is the right currency.

For your reference, a GTX 980 Ti costs approx. $1100 AUD here.

After reading the information you provided, I can tell you exactly what I want:

1. A PC capable of running 1440p (2K) on a 27 inch monitor, with G-Sync or FreeSync depending on the GPU.

2. Dual monitors!!

3. Capable of handling games at ultra settings at 60fps with no difficulty!!

Please help... : ) much appreciated!!!! Such a good forum it is!!
 

DasHotShot

Honorable
Well... that is very high end...

Here is a suggestion for the PC components, including Windows but excluding the screens, peripherals etc.

You can go with the X99 or Z97 chipset, the latter being cheaper and showing similar performance to an i7-5820K minus 2 cores... it's a toughie. What do the others think?

Personally I would also wait to see what the AMD cards are like on release and see if the Fury XT is maybe the better option.

PCPartPicker part list / Price breakdown by merchant

CPU: Intel Core i7-4790K 4.0GHz Quad-Core Processor ($459.00 @ Centre Com)
CPU Cooler: Corsair H100i 77.0 CFM Liquid CPU Cooler ($157.00 @ CPL Online)
Motherboard: ASRock Z97 EXTREME4 ATX LGA1150 Motherboard ($199.00 @ CPL Online)
Memory: G.Skill Ripjaws X Series 8GB (2 x 4GB) DDR3-2400 Memory ($82.00 @ IJK)
Storage: Samsung 850 EVO-Series 250GB 2.5" Solid State Drive ($158.00 @ Centre Com)
Storage: Western Digital Caviar Blue 1TB 3.5" 7200RPM Internal Hard Drive ($69.00 @ CPL Online)
Video Card: EVGA GeForce GTX 980 Ti 6GB Superclocked+ ACX 2.0+ Video Card (2-Way SLI) ($1129.00 @ CPL Online)
Video Card: EVGA GeForce GTX 980 Ti 6GB Superclocked+ ACX 2.0+ Video Card (2-Way SLI) ($1129.00 @ CPL Online)
Case: Corsair Carbide Series 300R Windowed ATX Mid Tower Case ($118.00 @ CPL Online)
Power Supply: EVGA SuperNOVA 1000G2 1000W 80+ Gold Certified Fully-Modular ATX Power Supply ($249.00 @ PLE Computers)
Optical Drive: Samsung SH-224DB/RSBS DVD/CD Writer ($21.99 @ Mwave Australia)
Operating System: Microsoft Windows 8.1 OEM (64-bit) ($135.00 @ CPL Online)
Total: $3905.99
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2015-06-23 00:32 AEST+1000
 

Tomimimi

Reputable
Jun 18, 2015
19
0
4,510


Thanks!!!

But... I would need to SLI GTX 980 Tis!!? That is hella money haha. Thanks for the information, though!!

I also would like to wait and see what AMD's new product line can offer.

It releases tomorrow, 24 June, right?

Would you mind if I private message you after the release to ask what you think about it? I want to decide on GPUs with reference to your opinion. : )

Thanks!!