Nvidia Touts (Quiet) 'Fastest DX11 GPU on Planet!'

[citation][nom]servarus[/nom]Sapphire Vapor-X, no? Seems like ATI would still be leading in terms of technology.[/citation]
how do you figure?
 
The card has been launched. In a few hours we will have benchmarks (hopefully here at Tom's too).

Techpowerup has a review up, but it's down right now. I figure they realized (or Nvidia reminded them) that they have to wait until it's past midnight in the US, not just in Europe.

Check it out; some people actually managed to read it before it was taken down:

http://forums.steampowered.com/forums/showthread.php?t=1570525

And PC Magazine has announced it officially too: it's released TODAY, Nov 9th 2010:

http://www.pcmag.com/article2/0,2817,2372324,00.asp (I actually saw this one in person, but it's since been taken down too)

So guys, it'll only be a few more hours!
 
[citation][nom]ThermalDynamics[/nom]@WhySoBluePandaBear When we say relatively low heat, is that with or without the liquid cooling solution? When they attach liquid cooling as part of their reference design, it tells me this thing is going to be pushing some serious heat without it. And yes, people who think it needs to reach 100°C for a liquid cooling solution to work are a little annoying, but not everyone has a learned understanding of thermodynamics.[/citation]


Yeah, and a car engine would burn up, seize, and cease to function without proper cooling. A lot of things would run insanely hot without adequate cooling, so I don't get what you're trying to say. Does it really matter what type of cooling the manufacturer uses? You hardly heard people complaining when GPUs adopted fans... back in the day, cards used passive cooling.
 
Vapor chamber cooling is just an advanced form of the heat pipe, which is already used in many graphics cards and CPU coolers. Instead of being a pipe, it is flat and can transport heat away from the heat source in all directions instead of in just one direction like a traditional heat pipe. A heat pipe uses capillary action in a wick structure to return the condensed working fluid to the hot side of the device, counteracting gravity.
For those of you worried that, depending on the heat sink's orientation, the working fluid won't make it back to the GPU: you haven't even looked at your own CPU cooler. Many high-end tower CPU coolers use multiple L-shaped heat pipes. In the most common configuration, a tower case with the processor facing left as you look at the case from the front, the heat sink sits on its side with the short leg of each "L" vertical and the long leg running into the fin array. By your reasoning, only the heat pipes with the short leg pointing up would work, and the ones pointing down would not, so only the top half of the heat sink would be doing any good. That is not true, and you already know it: you don't get any better cooling out of your CPU heat sink if you orient your case as a desktop, where all the short legs are horizontal. There is a wick inside each heat pipe that pulls the condensed working fluid back toward the hot end, which is why the heat pipes with the short leg pointing down still work.
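
If you want to put a rough number on how strong that capillary action is, here's a quick Python sketch using Jurin's law. The pore radius and fluid properties are assumed, illustrative values, not specs from any real cooler:

[code]
# Rough sketch: how far can a wick lift water against gravity?
# Jurin's law: h = 2*gamma*cos(theta) / (rho * g * r)
# All numbers are illustrative assumptions, not specs of any real cooler.
import math

gamma = 0.059   # N/m, surface tension of water near 50 C (approx.)
theta = 0.0     # rad, assume perfect wetting of the wick
rho   = 988.0   # kg/m^3, density of water near 50 C (approx.)
g     = 9.81    # m/s^2
r     = 50e-6   # m, assumed effective pore radius of a sintered wick

h = 2 * gamma * math.cos(theta) / (rho * g * r)
print(f"Capillary rise: {h * 100:.1f} cm")   # ~24 cm with these numbers
[/code]

That is far more lift than the few centimeters of vertical run inside a CPU heat pipe, which is why orientation barely matters.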

The same thing goes on in a vapor chamber as in a heat pipe, except that instead of moving heat in only one direction, it can move heat in all directions away from the heat source; think of it as a heat spreader. This spreading moves heat away from the processor faster than the pure conduction of a standard heat sink base would. It makes more of the heat sink useful by carrying heat to the outer edges, which are typically cooler than the center because solid metal conducts heat poorly from the center of the heat sink to the extremities. By moving heat away from the center faster than conduction alone, you get better cooling, and by spreading the heat across more of the heat sink, you let more of it be used effectively.
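
To see why that matters, here is a small Python sketch comparing the lateral temperature drop across a solid copper base with a vapor chamber modeled as a slab with a high effective conductivity. All of the geometry, power, and conductivity numbers are assumptions for illustration:

[code]
# Illustrative comparison: lateral temperature drop across a heat spreader,
# solid copper vs. a vapor chamber treated as a slab with a high
# "effective" thermal conductivity. All values below are assumptions.
Q = 200.0          # W, assumed GPU heat load
L = 0.03           # m, lateral distance from die to spreader edge
A = 0.003 * 0.10   # m^2, slab cross-section (3 mm thick x 10 cm wide)

k_copper = 400.0   # W/(m*K), bulk copper
k_vapor  = 8000.0  # W/(m*K), commonly quoted effective value (order of magnitude)

for name, k in [("copper", k_copper), ("vapor chamber", k_vapor)]:
    dT = Q * L / (k * A)   # Fourier's law for 1-D conduction
    print(f"{name:14s}: dT = {dT:.1f} K center-to-edge")
[/code]

With these assumed numbers the copper slab drops about 50 K from center to edge, while the vapor chamber drops only a couple of degrees, so the whole fin array stays close to die temperature and actually does work.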

As for the concern about a vapor chamber dripping water onto your computer: forget about it. These are sealed pressure vessels with very little liquid inside. The Nvidia rep says it contains water, and it might, but many different fluids are used depending on the temperature range the heat pipe or vapor chamber is designed to operate in. They do have a range in which they are effective, because part of the working fluid has to be a gas and part a liquid. If the working fluid is all liquid, you're just getting conduction through the casing; the same goes if all the liquid has boiled off, since a vapor alone transfers no additional heat.

What really makes a heat pipe or vapor chamber so effective is the energy absorbed during vaporization. Remember from high school chemistry that the energy required to change phase is far higher than the energy required to merely raise the temperature of a material.
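
A quick Python sketch with textbook constants for water makes the point:

[code]
# Why phase change matters: energy to vaporize 1 g of water vs. energy
# to warm the same gram by 10 C. Textbook constants for water near 100 C.
m     = 0.001    # kg (1 gram)
c     = 4186.0   # J/(kg*K), specific heat of liquid water
L_vap = 2.26e6   # J/kg, latent heat of vaporization

Q_warm = m * c * 10.0   # heat it by 10 degrees
Q_boil = m * L_vap      # vaporize it at constant temperature

print(f"Warm 1 g by 10 C: {Q_warm:6.0f} J")
print(f"Vaporize 1 g:     {Q_boil:6.0f} J  ({Q_boil / Q_warm:.0f}x more)")
[/code]

Every gram that boils at the GPU and condenses at the fins carries roughly fifty times the heat that merely warming it would.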

Don't be concerned about the amount of liquid in a heat pipe or vapor chamber; it is a minuscule amount. And for those of you worried about your GPU having to run at 100°C: don't be. As many have correctly pointed out, varying the pressure above a liquid changes the temperature at which it boils. By reducing the pressure inside the heat pipe or vapor chamber, you can make the fluid boil at a lower temperature. The boiling point can also be changed by using a different fluid.
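
Here's a rough Python sketch of that relationship for water, using the Antoine equation (constants valid roughly from 1 to 100°C; the pressures below are illustrative, not measured heat pipe specs):

[code]
# Sketch: boiling point of water vs. internal pressure, via the Antoine
# equation. Constants for water, valid roughly 1-100 C, pressure in mmHg.
import math

A, B, C = 8.07131, 1730.63, 233.426   # Antoine constants for water

def boiling_point_c(p_mmhg):
    """Temperature (C) at which water's vapor pressure equals p_mmhg."""
    return B / (A - math.log10(p_mmhg)) - C

for p in (760, 200, 55, 20):          # 760 mmHg = 1 atm
    print(f"{p:4d} mmHg -> boils at {boiling_point_c(p):5.1f} C")
[/code]

Pull the chamber down to a few dozen mmHg and the water boils in the 20-40°C range, right where a GPU starts making heat.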

As for being worried about vapor chambers in general: they have long been used by NASA, the military, and anyone else who needed to move heat away from a location quickly. They just haven't seen widespread use in the personal PC space, for reasons of cost and need. Remember, it was only a few short years ago that graphics cards didn't have fans at all, and before that they didn't even have heat sinks. Okay, maybe it has been more than a few years since graphics cards went without heat sinks, but there was such a time; I know I am dating myself. It is necessity that has driven Nvidia to pair a vapor chamber with a heat sink to cool this GPU.

Obviously the card being described runs hot and requires more effective cooling than heat pipes alone, which spread heat in only one direction, can provide. The vapor chamber will help a lot. If they don't find ways to cool the GPU and spread the heat more effectively, we are going to see triple-slot graphics cards become the norm. If they could improve the cooling side, maybe we could get back to single-slot cards at the high end; wishful thinking, I know. That would actually let you use more of the slots on your motherboard.

I totally disagree with the individual who said his graphics card could be 100 dB (sustained exposure around 90 dB risks hearing damage; 100 dB is jackhammer territory) if it delivered 2x-3x more performance. Noise has become a serious issue. This is why the various PC System Design Guide standards (PC 97, PC 98, PC 99, PC 2001, etc.) regulated not only power usage but also noise. Noise regulation matters because excessive noise takes away from your ability to focus on a task, and it interferes with your ability to be immersed in a game or movie. If you have ever watched a movie without the sound, it loses a significant portion of its immersion and emotional effect; the same goes for trying to watch a movie with a vacuum cleaner running right next to you. I personally say "thank you" for the addition of the vapor chamber and the better, more efficient cooling that will allow for quieter operation.
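
Keep in mind that decibels are logarithmic, so the gap between a quiet cooler and that hypothetical 100 dB card is far bigger than the numbers suggest. A quick Python sketch (the 40 dB figure is an assumed quiet cooler, not a measurement):

[code]
# Decibels are logarithmic: a modest-looking dB gap is a huge intensity gap.
# Ratio of sound intensities for a level difference is 10**(dL/10).
def intensity_ratio(db_a, db_b):
    return 10 ** ((db_a - db_b) / 10.0)

quiet_fan = 40.0    # dB, assumed quiet GPU cooler under load
loud_card = 100.0   # dB, the hypothetical card from the post above

print(f"{intensity_ratio(loud_card, quiet_fan):,.0f}x the sound intensity")
# -> 1,000,000x: why "100 dB if it's fast enough" is not a serious trade.
[/code]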

Of course I agree with the poster who said that Nvidia and AMD/ATI really need to focus on reducing the amount of heat generated in the first place. A die shrink, anyone, say 32 nm instead of 45 nm as Intel managed? Too bad TSMC can't get its lithography process working correctly, so there will be no die shrink for these reworked DX11 parts this year from either Nvidia or AMD/ATI. A die shrink is what allowed Intel to add two more cores to the i7, up to six cores, on a smaller processor, without increasing power consumption.
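
The first-order physics behind that is the CMOS dynamic power equation, P ≈ a·C·V²·f. A short Python sketch, with scaling factors that are purely assumed for illustration:

[code]
# First-order CMOS dynamic power: P ~ a * C * V^2 * f. A die shrink cuts
# switched capacitance C and lets voltage V drop, so you can add cores or
# raise clocks at the same power. The scaling factors below are assumptions.
def dynamic_power(C, V, f, a=1.0):
    return a * C * V**2 * f

P_old = dynamic_power(C=1.0,  V=1.10, f=1.0)   # old node, normalized
P_new = dynamic_power(C=0.70, V=1.00, f=1.0)   # shrunk: ~0.7x C, lower V

print(f"Power at the same clock: {P_new / P_old:.2f}x of the old node")
# -> ~0.58x: headroom that can be spent on extra cores at the same TDP.
[/code]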

Unfortunately, the gaming world has not been as vocal about reducing power usage as the enterprise/industrial side has; think of data centers that run out of power capacity while still having plenty of floor space. This is why Intel and AMD work so much harder on reducing power consumption, and talk so much more about performance per watt, than the graphics side of the business does. If only gamers would demand that Nvidia and AMD/ATI work on power consumption as hard as they work on raw graphics performance, we would not need these advanced cooling methods and ridiculously large power supplies.

It would also mean real improvements in laptop graphics, which we have not seen over the last few generations of cards. The performance gains in desktop cards have come at the expense of huge increases in power consumption, like the Pentium II to Pentium 4 days, when power consumption grew faster than performance. Laptops cannot absorb those increases, hence the lack of improved graphics in laptops. Think about it: the high-end Nvidia GTX 480M laptop part is no better than a desktop GTS 450; how pathetic is that? Sooner or later the graphics side of the business is going to have to address power, and the resulting heat, just like the CPU players Intel and AMD have.
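
The laptop ceiling reduces to one line of arithmetic: performance is performance-per-watt times the watts you can actually cool. A Python sketch with assumed round-number budgets, not real card specs:

[code]
# The laptop ceiling in one line: performance = (perf per watt) x (watts
# you can cool). The budgets below are assumed round numbers.
perf_per_watt = 1.0        # normalized, same architecture on both sides

desktop_budget_w = 300.0   # assumed desktop card power budget
laptop_budget_w  = 100.0   # assumed laptop cooling/power budget

ratio = (perf_per_watt * laptop_budget_w) / (perf_per_watt * desktop_budget_w)
print(f"laptop perf = {ratio:.0%} of desktop")
# Raising perf/W is the only way mobile parts close this gap; a bigger
# heat sink doesn't fit in a laptop.
[/code]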

Nvidia says they really care about gaming, but they are more focused on turning their GPUs into general-purpose processing units that offload heavily threaded work from the CPU. It is clear why: Intel can put a reasonably performing graphics unit right on the CPU package. Yes, the current Core i3/i5 graphics can't play modern games, but it is only a matter of time before the budget and mainstream markets are eroded out from underneath Nvidia. Nvidia recognizes this, and while we gamers care about graphics performance and spend lots of money, we are pocket change in the grand scheme of things; Nvidia will go the way of the dodo if they do not evolve their business.

For all you people whining about vapor chambers, 100°C operating temperatures, gravity, etc.: why don't you use some of that time to do a Google or Wikipedia search and learn something instead? It amazes me that some of you will spend all day figuring out how to overclock a CPU or graphics card to gain a few MHz or fps, but can't be bothered to look up the technology that delivers those few MHz or fps. Use that big lump of mass in your head called a brain, if you actually have one, and do a search instead of just being a "me too" poster.

 
Nice post Darkenergy! Was hoping someone would put an end to the ridiculous comments in this thread. A few more points for those unwilling to Google for answers:
* Vapor chambers have been around for a long time, just not in this industry. I first saw them a couple of decades ago while working in the defense industry.
* Vapor chambers and heat pipes have a low internal pressure, which reduces the water's boiling point. I'm guessing it's tuned to boil around 30-40°C for this application.
* The insides of vapor chambers and heat pipes are lined with a wick structure that absorbs condensed water and returns it to the heat source through capillary action. This makes the system independent of gravity.
* When the system is cold, there usually isn't any water in the chamber that is not contained within the wick. If you cut one in half with a saw, you likely wouldn't get any water out, so there isn't much worry about leakage.
 
I want to see temperature reviews with the card in different orientations. I suspect the most common orientation is the worst case for this cooling solution.
 
@WhySoBluePandaBear

The point is that not even a CPU requires liquid cooling by default (granted, CPUs do have bigger coolers). I'm just suspicious of the efficiency of a GPU that requires liquid cooling by default when the competition's doesn't. This thing has to outdo the competition by a significant margin to justify making liquid cooling a default requirement.
 
Thank you darkenergy for that enlightening post.

[citation][nom]nebun[/nom]how do you figure?[/citation]
Well, they were the first to use the vapor tech, DX11, and dual-GPU cards, plus their cards do better on power draw and noise.

Not that I'm saying Nvidia is lacking; they have their pluses, like CUDA cores, 3D Vision and such. It's appealing, but spending my cash on it does not seem, well, worth it.

But seriously, as a basic user I care more about the price/performance ratio, technology implementation, heat and power consumption, and noise level, all of which Nvidia seems to be lacking in. To me, at least.
 
[citation][nom]servarus[/nom]Thank you darkenergy for that enlightening post. Well, they were the first to use the vapor tech, DX11, and dual-GPU cards, plus their cards do better on power draw and noise. Not that I'm saying Nvidia is lacking; they have their pluses, like CUDA cores, 3D Vision and such. It's appealing, but spending my cash on it does not seem, well, worth it. But seriously, as a basic user I care more about the price/performance ratio, technology implementation, heat and power consumption, and noise level, all of which Nvidia seems to be lacking in. To me, at least.[/citation]
Good point, but this technology is not for a "basic" user, so your comment does not apply. Nvidia makes wonderful products, plus their drivers are way better than ATI's. I have owned products from both companies, and Nvidia has given me the best products and drivers. I love how powerful CUDA is compared to Stream.
 
Yes indeed! Finally someone who gets it! He was testing the new Porsche 911 Spyder against the V12 R8, I think... if I remember correctly.

Anyway, on topic: I'm amazed at how many people don't get this vapor chamber technology, and even more amazed that Nvidia is desperate enough to take up Apple's marketing playbook: take something that's long been out and pretend they invented it... :pfff:
 