Kepler news and discussion

Status
Not open for further replies.
OK, whatever, don't believe me, but mark my words. I know, I know, everyone makes up these stupid lies, but this one is the truth. The guy I speak of is one of the founders of LPT, and he told me and several others that he has an inside guy at Intel.

Anyway, back to Kepler; this isn't an Ivy Bridge thread.
 
For those who are interested in GTX 680 2GB (SLI) vs. HD 7970 3GB (CFX):

http://www.hardocp.com/article/2012/03/28/nvidia_kepler_geforce_gtx_680_sli_video_card_review/9

The biggest question in regards to performance and gameplay experience about GeForce GTX 680 SLI was if the 2GB of VRAM per GPU and lesser memory bandwidth compared to Radeon HD 7970 would be a hindrance. Our testing has clearly answered that question. In fact, in every game we tested, GTX 680 SLI offered a better gameplay experience compared to Radeon HD 7970 CrossFireX. We specifically tested at NV Surround and Eyefinity at the maximum resolution of our configuration at 5760x1200 to see if there would be any bottlenecks. We found that the new GeForce GTX 680 SLI has the performance where it counts.

We know exactly what you guys are thinking. The Radeon HD 7970 has 3GB of VRAM, the GeForce GTX 680 has 2GB; the Radeon HD 7970 has 264GB/sec of memory bandwidth and the GeForce GTX 680 has 192GB/sec of memory bandwidth. You'd expect Radeon HD 7970 CrossFireX to simply blow GeForce GTX 680 SLI out of the water at 5760x1200. The simple fact is, it does not, and in fact GeForce GTX 680 SLI provides a better gameplay experience with better performance. Amazing, but true. Obviously AMD’s driver engineers need to figure out how to utilize the hardware more efficiently, because at the moment, NVIDIA is taking AMD to school.

:)
 
Yup. I got an error loading Tom's site when submitting my initial post. When the page loaded again my post wasn't there, so I thought it hadn't gone through. When I posted the second one, the first post showed up 😛
 

It happens.....
 
People, help me to get my sanity here!

We did expect some price wars by the time the 680 released, right? Something that might push AMD to be more honest with their lineup's price tags. But you see... well... it ain't happening!! Why?? Why are prices staying put?? I'm running out of patience here.

In fact, both AMD's and NVIDIA's top-end cards are priced higher right now than they were before!! What is happening? 🙁

And here I am, confused, not knowing what to do with the $450 budget of hard-earned savings for the best card I could get... and any other option at that price is already matched by my 5850 CF.

I dunno about you rich people, but for once in my PC gaming lifetime I want to own the best card of its time. I'm tired of getting second-hand. And I want to do it right now, since I'm marrying my girl next year. I don't think gaming will be viable then.

If things keep up like this for another month, I'll spend my money on pimping my car and get by with my 5850 CF for the rest of my life...
 
NVIDIA has done well this time around. I wonder why it's called the GTX 680, though. It's their flagship model, but aren't they going to release more GTX 600-series cards?
 


A price war (or at least a reduction in 7970/7950 prices) may occur once the 680 stays available for more than a couple of hours before selling out. The fact is, if you go to buy a graphics card today, you can't get a 680. A 7950 at $450+ or a 7970 at $550+ is what you can choose between. We need more cards! :ange:
 

Dude..... You gotta be patient. I know the lower- and mid-range Kepler cards are due out in the first week or two of April, so hang in there. It's only another three weeks at most before a model in your price range will be available for purchase.
 

This caught my eye from that review:
"With the nature of GPU Boost, both video cards can run at independent clock speeds. This means that GPU Boost can be boosting each GPU separately as it needs to for the best performance, the clock speeds don't have to match up. This provides the best performance at all times based on GPU utilization, power, temp and other things."

I wonder if that means you can run two different cards with different base clock speeds without one downclocking to match the lower clocked one. Like when people have a reference model paired with a superclocked model.
 

No, the chip manufacturer (TSMC, or whatever they're called) needs to get off their tail and get to making more 28nm chips so they can distribute more to Intel, NVIDIA, and.... AMD. AMD was first in line to buy most of them, then Intel, and NVIDIA is last..... So it's kinda like the Taiwanese need to quit slacking on purpose and get to work. We have a GTX 680 shortage because there aren't enough chips.
 

And this REALLY caught my eye:
SLI smoothness vs. CrossFireX smoothness

We don't know what other descriptive word to use, other than "smoothness" to describe the difference we feel between SLI and CrossFireX when we play games. We've expressed this difference in gameplay feeling between SLI and CrossFireX in the past, in other evaluations, and we have to bring it up again because it was very apparent during our testing of 680 SLI versus 7970 CFX.

We can't communicate "smoothness" to you in raw framerates and graphs. Smoothness, frame transition, and game responsiveness are the experience that is provided to you as you play. Perhaps it has more to do with "frametime" than it does with "framerate." To us it seems like SLI is "more playable" at lower framerates than CrossFireX is. For example, where we might find a game playable at 40 FPS average with SLI, when we test CrossFireX we find that 40 FPS doesn't feel as smooth and we have to target a higher average framerate, maybe 50 FPS, maybe 60 FPS, for CrossFireX to feel like NVIDIA's SLI framerate of 40 FPS. Only real-world hands-on gameplay can show you this, although we can try to communicate it in words. Even though this is a very subjective realm of reviewing GPUs, it is one we surely need to discuss with you.

The result of SLI feeling smoother than CrossFireX is that in real-world gameplay, we can get away with a bit lower FPS with SLI, whereas with CFX we have to aim a little higher for it to feel smooth. We do know that SLI performs some kind of driver algorithm to help smooth SLI framerates, and this could be why it feels so much better. Whatever the reason, to us, SLI feels smoother than CrossFireX.

Personally speaking here, when I was playing between GeForce GTX 680 SLI and Radeon HD 7970 CrossFireX, I felt GTX 680 SLI delivered the better experience in every single game. I will make a bold and personal statement; I'd prefer to play games on GTX 680 SLI than I would with Radeon HD 7970 CrossFireX after using both. For me, GTX 680 SLI simply provides a smoother gameplay experience. If I were building a new machine with multi-card in mind, SLI would go in my machine instead of CrossFireX. In fact, I'd probably be looking for those special Galaxy 4GB 680 cards coming down the pike. After gaming on both platforms, GTX 680 SLI was giving me smoother performance at 5760x1200 compared to 7970 CFX. This doesn't apply to single-GPU video cards, only between SLI and CrossFireX.
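To put some numbers on the frametime point in that quote, here's a toy illustration (nothing to do with HardOCP's actual methodology, and the numbers are made up): two traces can have the identical average FPS while the worst frames feel very different, which is the "smoothness" they're describing.

```python
# Two hypothetical frametime traces (milliseconds) with the same average
# FPS but very different smoothness. All numbers are invented for illustration.
smooth = [25.0] * 40            # steady 25 ms per frame -> 40 FPS
stutter = [15.0, 35.0] * 20     # alternates 15/35 ms -> also 40 FPS average

def avg_fps(frametimes_ms):
    """Average framerate, which hides frame-to-frame variation."""
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def percentile(frametimes_ms, p):
    """p-th percentile frametime: what the worst frames feel like."""
    ordered = sorted(frametimes_ms)
    idx = min(len(ordered) - 1, int(p / 100.0 * len(ordered)))
    return ordered[idx]

print(avg_fps(smooth), avg_fps(stutter))                # 40.0 40.0
print(percentile(smooth, 99), percentile(stutter, 99))  # 25.0 35.0
```

Both traces report "40 FPS" on a bar chart, but the stuttery one spends half its frames at 35 ms, which is why an FPS graph can't show the difference the reviewers felt.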

 
I saw that too. I can't remember for sure, but I thought Tom's had a write-up on SLI/CrossFire micro-stutter. What I remember was that, depending on the game, either SLI or CrossFireX had more micro-stutter, but there was no clear winner. So that is interesting to read.
 


It was a good article- found here - August of 2011. They pitched the HD 6870 (in single, crossfire x2, and crossfire x3 [using a dual gpu card and a single gpu card]) against the GTX 560. The results were very interesting but I considered the scope of the analysis to be very narrow. If it is just a comparison of one mid-level model from each camp it is difficult to make sweeping generalizations about Crossfire and/or SLI. The question that the article did NOT put to bed was "do the high end offerings from each camp suffer/enjoy the same microstuttering, or lack thereof, shown in the article?" The quote posted about 680 SLI vs 7970 crossfire suggests the answer is "yes", but an article specifically aimed at addressing the issue would be great.

So, this is my formal pitch for an updated story with more cards once the 28nm lineups fill out more. I know it is REALLY extensive testing (each card gets tested in each benchmark 2 or 3 times) but this seems to be a very hot button issue these days.

To tie it back in to this thread (since this is, after all, a Kepler discussion thread), I think this falls under the realm of perceived/actual superior driver support enjoyed by NVIDIA. As the mid-level Kepler cards come out, we hope to see them hit price points below AMD's current 7xxx offerings with superior performance. If this SLI smoothness is evident in the mid-level cards, an EXTREMELY killer dual-card setup could be had for ~$500 that skates way past the 680 and 7970. My $.02. Cheers.
 
The benefit that the smaller bundle brings to the table is that Palit can be a bit more aggressive on the price. While not official, talking to them we're expecting the card to come in at around $20-$30 more than reference-class models. This would continue to put the card in below the HD 7970, and when you're talking about a $500+ video card, you'd be hard pressed not to want the Palit version over the reference card for only slightly more money.
 

The article is saying $20-$30 more than reference.

As I learn more about the dynamic overclocking function, I'm convinced that improved cooling is the key to overclocking. Aftermarket cooling should allow the GPU to stay cooler, which will in turn allow for higher boost clocks. This has always been true, but with the hard-wired dynamic overclocking function, the boost clock is set in a direct relationship to GPU temperature and ASIC quality. As you can see in the chart, there are different clock-speed levels corresponding to 10 C and 5 C increments. The key to maximizing the boost clock is to keep the GPU below 80 C, or even 70 C (like maybe with the Palit Jetstream). Can't wait for the MSI Twin Frozr version.

[Chart: boost clock vs. GPU temperature]
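The temperature-to-boost relationship described above boils down to a lookup table: cooler GPU, higher boost bin. Here's a minimal sketch of that idea; the bin thresholds and clock values are invented placeholders, not NVIDIA's actual table.

```python
# Hypothetical GPU Boost bins: illustrative numbers only. The real table
# also depends on power draw and ASIC quality, which are ignored here.
BOOST_BINS_MHZ = [
    (70, 1110),  # below 70 C: top boost bin
    (80, 1097),  # 70-80 C: one bin down
    (90, 1084),  # 80-90 C: another bin down
]
BASE_CLOCK_MHZ = 1006  # GTX 680 base clock

def boost_clock(temp_c):
    """Return the boost clock for a given GPU temperature (sketch)."""
    for limit, clock_mhz in BOOST_BINS_MHZ:
        if temp_c < limit:
            return clock_mhz
    return BASE_CLOCK_MHZ  # too hot: fall back to the base clock

print(boost_clock(65))  # 1110
print(boost_clock(75))  # 1097
print(boost_clock(95))  # 1006
```

Which is exactly why better aftermarket cooling translates directly into sustained higher clocks: shaving 10 C off the GPU moves you up a bin.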


You can check your own ASIC quality using GPU-Z. Just right-click on the menu bar and select "Read ASIC quality...":
http://www.techpowerup.com/downloads/2120/TechPowerUp_GPU-Z_v0.6.0.html
 
Oh well, it's still a beast of a card. I guess I'll either have to wait for free shipping on the 680 or just buy a 670 Ti....
 