Nvidia GeForce GTX 1070 8GB Pascal Performance Review (Archive)

I don't think any card will ever be able to run "all" games on ultra at 4K; every time a new AAA game comes out it may push the graphics bar higher, and some games are just badly coded. Now, if you're saying your SLI 970s (which don't scale in all games, again because of coding) are working well for you and you don't see a reason to upgrade and pay the extra $$, well then that's great.
 
And what happens when the cards are overclocked? Comparing cards in bar-graph charts without reference to relative overclocking abilities is like McDonald's posting the nutritional information for a raw potato in a review about their french fries. I kinda wanna know what it does to my arteries before I have that meal.

Whether or not one chooses to OC is an individual choice, but not providing the information does the reader a disservice.

Without that, I have no way to judge the performance... the 980 Tis overclocked had fps improvements in the 30% range (non-reference)... relevant, no?

12.8% reference 1080... https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/30.html

compared to the reference 980s ...

14.6% reference 980 ... https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/29.html

Considering that with the 9xx series cards the 3rd party offerings delivered as much as a 31% fps increase over the reference card, I would not be in much of a rush to be the first to have the "latest and greatest".
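
For anyone who wants to sanity-check those numbers, here's a minimal sketch of the arithmetic; the fps values are hypothetical placeholders, and only the percentage gains (12.8%, 14.6%, ~30-31%) come from the linked reviews:

```python
def oc_gain_pct(stock_fps: float, oc_fps: float) -> float:
    """Percent fps improvement from overclocking."""
    return (oc_fps - stock_fps) / stock_fps * 100

# Hypothetical 60 fps baseline, scaled to the quoted gains:
print(f"{oc_gain_pct(60.0, 67.7):.1f}%")  # ~12.8%: reference 1080 headroom
print(f"{oc_gain_pct(60.0, 78.6):.1f}%")  # ~31%: best 9xx-series partner cards
```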
 


4K isn't ready for prime time... I can't see investing in 4K until monitors exist that can do ULMB and cables exist that can carry a 144/165 Hz signal. Two 970s work quite well at 1440p on a fast IPS panel... I wouldn't do 4K until cards exist that can match those frame rates with ULMB enabled.




We heard the same thing with the 3xx series... then the 970 outsold all AMD 2xx and 3xx series cards combined. Don't get me wrong, I'd love to see AMD put something competitive on the table above the 380/380X... but recent history has shown that the pre-launch "will be's" just have not lived up to those expectations.
 


correction: "no cheap card"

and why 4K? Upgrade to 1440p or to a 34-inch.

and your GTX 970 can't do 1440p either, but the 1070 CAN
 
Well, in a game where CF scales well/correctly, I'm pretty sure 2x 380 would be faster than a 670; of course, CF/SLI drivers always tend to lag behind a single-card configuration and rarely scale to what people expect or hope to see, not to mention the additional power and cooling that is normally required. Also, outselling, while a good metric for business success, doesn't mean it's a better product (as a general rule).
 


Min fps never scales... it is the average fps that scales well. That's why I always buy the best single card before any multi-card setup; after getting the best card, I think about multi-GPU if needed.
 


There will be a time when 4K becomes mainstream, and that time will come very soon.
I've owned a 24" 1920x1200 monitor for 9 years. Back then, it was "high end". Of course, there was an "ultra high end" 2560×1600 above it, but most people used something between 1024x768 and 1680x1050. And there was no card able to run "everything on ultra" at those high-end resolutions (Crysis, anyone?). And then there was 😛

Today most reviewers and bloggers are talking about 4K performance. But honestly, who cares... in a year or two, any discrete card will be able to handle 4K, because there are no low-end cards anymore. You can expect the lowest Nvidia 1000-series card to perform like a GTX 970 (same for the red team). That means next gen, we are talking about 1070-level performance, or slightly less, as the minimum for discrete cards. Those cards will start at around 170-200 USD. They will finally push game developers towards things like real-time ray tracing (or similar tech) for awesome graphics, because there is no point pushing resolutions beyond 4K for monitors. Maybe for walls :) But the only way for discrete graphics to survive is to deliver a better image rather than a bigger one.
 

Not until 4K reaches price parity with 1080p, which may take a few more years. 1080p didn't become truly mainstream until 1080p monitors and TVs reached price parity with 1366x768 "HD" models, finally putting that BS to rest. (Well, not quite, since tons of entry-level laptops still use 1366x768 or similar BS resolution displays.)
 


4K monitors are already mainstream; you can find a good one for as low as $350

http://www.newegg.com/Product/Product.aspx?Item=N82E16824022258&cm_re=monitor-_-24-022-258-_-Product

... that's very cheap for a 4K monitor... we bought our FHD monitors at the same price when they went mainstream back in the day.

What we're missing is a $300 GPU that can handle 4K at 60 fps.

 


1080p is a waste with either a 1080 or 1070; that's why it is not included. If you are only playing at 1080p, then stick with a 970 or wait for the 1060 to show up, whenever that is. It's not worth the time to test just to say "look at that big number". If you REALLY need them, look for 980 Ti/Titan X 1080p numbers and add a few to that. Oh wait, there are few of those as well, since they are also WAY overkill for 1080p.



I totally disagree. This is the first card that can actually max out 1080p.
When I play with my two or three monitors, using just the main 1920x1200 one for gaming, I don't want to have to compromise on settings, and so far every single card has made me do so.
I have an open Skype window running on a second monitor and a YouTube vid on a third; that kind of real-world load can drag the single-task review results into oblivion... not that they show any overkill at 1080p anyway.
So NO, the 1070 is the very first card that looks able to handle 1080p properly, on a real system, in real life, without always having to compromise.

Remember, we're not all running the latest 16-core overclocked Xeon - the results you see above come from the choicest of setups with ultra-clean installs and anything extra NOT FRIKKIN RUNNING - and they've told you this plenty, so pretending the above results are real life for end users is to be very, very thick and inexperienced.

You're welcome.
 


I don't know if this has been posted here yet, but I saw this over on Guru3D the other day.

http://www.guru3d.com/articles-pages/msi-geforce-gtx-1080-gaming-x-8g-review,1.html
 


I guess the mainstream price is around $200 or even less for a monitor.
But it's not so much about the price point as about the price difference. For example, for 24" monitors:
1080p - lowest price is 99 USD
1440p - lowest price is 191 USD
4K - lowest price is 399 USD (your example is 23.6")
So there is a 200 USD difference between 1440p and 4K, and a 300 USD difference between 1080p and 4K.
That's a huge difference. You can have two of the lower-resolution monitors and have some change left,
or you can have a 1080p monitor + a decent GPU for the price of one 4K monitor.
And frankly, from a distance of 0.8-1.2 meters, the picture quality of 1080p on a 24" monitor is fine, and you will have a hard time telling the difference between 1080p and 1440p.
To be able to enjoy (actually see the difference of) 4K over 1080p/1440p on a 24" monitor, you have to be at most 1 foot (30 cm) from the monitor :)
And yes, I know that most people don't know that and will buy the marketing BS, as they do with phones :)
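
If you want to check that viewing-distance claim yourself, here's a rough sketch using the common ~1-arcminute (20/20 acuity) rule of thumb; the resolutions and the 24" diagonal come from the post above, while the acuity model (and so the exact cutoff distances) is an assumption:

```python
import math

ONE_ARCMIN_RAD = math.radians(1 / 60)  # ~20/20 visual acuity per pixel

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a panel."""
    return math.hypot(width_px, height_px) / diagonal_in

def pixels_visible_within_cm(panel_ppi: float) -> float:
    """Distance beyond which a 20/20 eye can no longer resolve single pixels."""
    pixel_pitch_in = 1 / panel_ppi
    return pixel_pitch_in / ONE_ARCMIN_RAD * 2.54

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    d = ppi(w, h, 24.0)
    print(f'24" {name}: {d:.0f} PPI, pixel detail gone past ~{pixels_visible_within_cm(d):.0f} cm')
```

On that model, the extra detail of 4K on a 24" panel stops being resolvable somewhere around 45-50 cm, so at a typical 0.8-1.2 m desk distance the "about a foot away" claim is in the right ballpark.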
 
In Sweden, pricing is nowhere near the MSRP; the 1080 is priced at 900€, so all talk of value is marketing BS. At $400 for the 1070, pricing should land at under 3500-4000 SEK, but I'll bet it lands at 5000 and thus out of reach for the mid-market, leaving only rich people, and they will probably pick the fastest card (the 1080) and not the 1070.

True, the non-US pricing madness.

Not 30 mins ago, here in Saigon, I saw a GTX 960 for $361.

At the airport yesterday, I felt like buying one of those conference speakers, an 'Esquire' conference speaker from HK. The price was 9,830 THB ($277). I held it, liked it, fired up my smartphone, and noticed it was $107 on a certain well-known shopping site that reminds me of a jungle.

Still don't wanna move back to the US though... :)
 


We never bought cheap $100 monitors, even as students... those are not mainstream; those are for really poor people...

Even in the days of CRTs we always bought Trinitron monitors, remember? They were $500 minimum for a 19-inch at 1280x1024/1600x1200, so a 24-inch 4K for $350 is CRAZY CHEAP.

These $100 monitors are crap and you know it :)

 


And we also know that these 350-400 USD 4K monitors might be even more crap, and definitely not less crap :)
And yet, I doubt I'd pay 1000 or more for a monitor today, even though it's a much smaller fraction of my income now :)
Over the years, I found that:

  • Color reproduction accuracy - don't care.
  • Fancy names - don't care.
  • Magic bullshit numbers - don't care.
  • Ports and expandability (USB hub, card reader) - don't care.

I use the monitor for gaming only.
I do not do photo/video editing. I don't consume multimedia on the PC. I don't even work on it.
And for gaming, cheap TN panels are the best. So the only thing that matters to me is the VESA mount :)

Basically, I don't buy a 34" 3440x1440 because they start at 1.3K where I live (what you see on Amazon for 750). I would, though, at 500-600 :)
 


I wouldn't call it BS. Most people probably don't even know what their laptop's resolution is. If it isn't really negatively affecting their email, Twitter, or web-based Flash games, it's not really a problem.
 
+Samer1970 I bought a Sony Trinitron 17" CRT for $500 years ago. It was this gorgeous, completely flat screen. I believe it had a 1024x768 resolution, and the thing had a beautiful image. I want to say the dot pitch was .22, but I'm not entirely sure. I didn't read all the earlier responses from today, so I'm not taking part in any online debate. Just wanted to reminisce.
 
People who are going to do 970 SLI will still not get better performance than a single GTX 1070. I wish Nvidia had changed SLI the way AMD did with CF, so you could run two different GPUs together.
 

I think $350 is well above mainstream pricing for monitors of any resolution and is only considered 'cheap' by enthusiasts and highish-end gamers. The majority of people (~85% based on the Steam survey) cannot be bothered to pay much over $200 for a GPU and I doubt many of them could be bothered to pay more than that for a monitor.

If it weren't that I absolutely wanted a 1200p monitor with a fully adjustable stand as my main monitor, I'd have bought a ~$150 1080p display instead of a $280 Dell UltraSharp two years ago.
 


That's because the new generation doesn't know the difference between a $150 monitor and a $350 one...

I miss the old days when you couldn't find cheap monitors around... quality stuff only... mid to high end... $350 minimum to $1500.


I guess I am from an earlier generation then :)

My old 340MB hard disk still works; compare that to modern cheap drives? You'll be lucky if they last 4 years...


The acceptable standards are becoming lower over time, because manufacturers decided to go cheap, with lower-quality products and short warranties. People don't understand this, so they just won't pay more; in the past, such cheap products did not exist at all.

I had like 25 old hard disks in the basement and one day decided to try them out. I was expecting the older ones to fail before the newer ones; they ranged from the 1990s till 2007. I was surprised that the oldest was still working, while for some of the newer ones the motor never even spun! Dead!

In short: old-time entry level = modern-time mid level.
 

The standards are going lower and lower because most consumers cannot be bothered to pay an extra $100 for otherwise seemingly identical products, and there is also the issue of low-quality products being sold at premium prices, further blurring the line between good and poor quality: you do not necessarily get more for paying more, even from better-known brands. By the time reliability data about products becomes representative enough for people to base purchasing decisions on, the products have already been discontinued, and the previous models' assessment is not necessarily representative of how much better or worse the replacement models are.

The only thing that might stop this race to the bottom and cutting corners until products fail just outside the manufacturer warranty is to make such environmentally irresponsible designs more costly in the end to encourage responsible design and a return to the engineering philosophy from 20-30 years ago where companies focused on building the best <whatever> they could for a reasonable price and word-of-mouth recommendations instead of aiming for the lowest price possible and heavily hype-based marketing.

Some places have consumer protection laws which may impose longer legal warranties on some goods (ex.: 10 years on TVs where I live), but unless a large proportion of consumers exercise their right to expect their stuff to last much longer than what is stated in the manufacturer warranty, it'll remain business as usual for most manufacturers.
 


Most hobbyist/enthusiast sites do (Guru3D, TechPowerUp, etc.)... "media empire" sites that worry about offending advertisers do not.

To my eyes, it's a glaring journalistic irresponsibility, especially on a PC enthusiast site, to ignore the fact that up and down the line, there is a huge gap here between the two major players.

There's a lot to be admired about AMD's approach of putting very aggressively clocked cards in the box. Pushing these cards so close to their max must mean a higher number of units out there that won't remain stable at these "factory OCs", and when people complain, they have to honor the advertised clock and replace the card. I have had the same thing happen with factory-OC'd cards from EVGA.

But looking at Techpowerup's testing for example ....

Fury-X fps increase when OCd = 105.05% / 980 Ti fps increase when OCd = 131.38%
390x fps increase when OCd = 107.12% / 980 fps increase when OCd = 122.71%
390 fps increase when OCd = 108.21% / 970 fps increase when OCd = 117.11%

Interesting trends ....

1. Nvidia's overclocking headroom increases as you go up in price/performance; AMD's goes down.
2. The OC difference at each level is enough to "jump a tier"... the 970, for example, is supposed to compete with the 390, but once overclocked, the 970 is notably faster than the 390X at 1080p and a hair faster at 1440p. That win at 1440p is so close it really can't be called a "win"; a tie is more apt, but there has been a $100 price difference there, and a tie doesn't justify paying a cost premium.
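
To make those ratios easier to compare, here's a small sketch converting them to plain percentage gains; note that reading "105.05%" as OC'd fps relative to stock fps is my interpretation of the TechPowerUp format:

```python
# OC'd performance as a percentage of stock, per the figures quoted above.
oc_ratio = {
    "Fury X": 105.05, "980 Ti": 131.38,
    "390X": 107.12, "980": 122.71,
    "390": 108.21, "970": 117.11,
}

for card, ratio in oc_ratio.items():
    print(f"{card:>7}: +{ratio - 100:.2f}% from overclocking")

# If a stock 970 starts out tied with a stock 390, overclocking both
# leaves the 970 ahead by roughly:
print(f"OC'd 970 vs OC'd 390: +{(117.11 / 108.21 - 1) * 100:.1f}%")  # ~8.2%
```

That ~8% swing is roughly the gap between adjacent tiers, which is the point made in item 2.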

Simply put, if a user wants information about "what card to buy", I don't see many enthusiasts even bothering to read a review that doesn't cover this issue... I've had many users say, "I don't want to OC now, but I want the card to be capable of it if I choose to do so in the future".

One of the reasons I think consumers are less savvy than they were years ago is that in-depth, data-rich reviews are very hard to find. How else would you see posts saying...

".. just buy whatever [insert model here] GFX card is the cheapest, you can always overclock the card to that of any other [same model] card .... no you can't. Cards whose PCBs contain higher quality components (VRMs, power delivery, capacitors, chokes. MOFSETS and how thise are all individually cooled have a very significant effect on card performance.

http://www.bit-tech.net/hardware/graphics/2014/09/19/nvidia-geforce-gtx-970-review/1

".. .I want to buy a $190 IPS 1080p monitor cause IPS is better" ... an IPS panel is better at certain things but to take advantage of what it offers, you need to get above the price floor for a quality IPS panel, $190 is below that floor
 