Vapor chamber cooling is just an advanced form of the heat pipe, which is already used on many graphics cards and CPU coolers. Instead of being a pipe, it is flat and can transport heat away from the heat source in all directions instead of in one direction like a traditional heat pipe. A heat pipe uses capillary action in a wicking structure to return the condensed fluid from the cool side back to the hot side, which is how it works against gravity.
For all of you worried about the orientation of a heat sink below the GPU and the working fluid not getting back to the GPU to cool it, you have not even looked at your CPU cooler. Many high-end tower CPU coolers use multiple "L"-shaped heat pipes. The most common configuration is a tower case with the processor pointing to the left as you look at the case from the front. This puts the heat sink on its side, with the short leg of the "L" vertical and the long leg running into the fin array. By your reasoning, only the heat pipes with the short leg pointing up would work and the ones pointing down would not, so only the top half of the heat sink would do any good. This is not true, and you already know it: you do not get any better cooling out of your CPU heat sink if you orient your case as a desktop, where all the short legs are horizontal. There is a wick inside the heat pipe that pulls the working fluid back toward the hot side, which is why the heat pipes with the short leg pointing down still work.
The same thing goes on inside a vapor chamber as in a heat pipe, except that instead of moving heat in only one direction it moves heat in all directions away from the source; think of it as a heat spreader. This spreading carries heat away from the processor faster than pure conduction in a standard heat sink allows. It makes more of the heat sink actually useful by carrying heat to the outer edges, which are typically cooler than the center because heat conducts poorly from the center of a heat sink to its extremities. By moving heat away from the center faster than pure conduction can, you get better cooling, and by spreading the heat across more of the heat sink, you let more of it be used effectively.
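To put rough numbers on that spreading advantage, here is a sketch only, using 1D Fourier conduction with assumed values for the heat load and geometry; the effective conductivity figure for the vapor chamber is an assumed ballpark (such devices are often quoted as many times more conductive than solid copper), not a measured spec for any real card:

```python
# Rough sketch, not a real thermal model: compare the temperature drop
# needed to push the same heat through a solid copper spreader vs. a
# vapor chamber, using 1D Fourier conduction  Q = k * A * dT / L.

def delta_t(q_watts, k, area_m2, length_m):
    """Temperature drop (K) needed to conduct q_watts across the slab."""
    return q_watts * length_m / (k * area_m2)

Q = 250.0        # GPU heat load in watts (assumed)
A = 0.0004       # conduction cross-section, m^2 (2 cm x 2 cm, assumed)
L = 0.05         # distance from die to heat-sink edge, m (assumed)

k_copper = 400.0    # W/(m*K), a typical value for copper
k_vapor = 10000.0   # W/(m*K), assumed effective value for a vapor chamber

print(f"copper spreader: dT = {delta_t(Q, k_copper, A, L):.1f} K")
print(f"vapor chamber:   dT = {delta_t(Q, k_vapor, A, L):.1f} K")
```

With these assumed numbers the copper path needs a temperature drop of almost 80 K to move the heat, while the vapor chamber needs only a few kelvin, which is exactly why the edges of the heat sink stay useful.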
As for the concern about a vapor chamber dripping water onto your computer, forget about it. These are sealed pressure vessels with very little liquid in them. The Nvidia guy says it contains water, and it might, but many different fluids are used depending on the temperature range the heat pipe or vapor chamber is designed to work in. They do have a range where they are effective, because part of the working fluid has to be a gas and part a liquid. If the working fluid is all liquid, you are back to pure conduction, with heat carried only through the casing of the heat pipe or vapor chamber. The same thing happens if all the liquid has boiled off: a vapor alone provides no additional heat transfer.
What really makes a heat pipe or vapor chamber so effective is the energy absorbed during vaporization. Remember from high school chemistry that the energy required to change phase is far higher than the energy required to just raise the temperature of a material.
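As a quick back-of-the-envelope check, using standard textbook values for water:

```python
# Why phase change moves so much heat: compare the energy to warm
# 1 g of liquid water by 1 K against the energy to vaporize that gram.
# Standard textbook values for water:
SPECIFIC_HEAT = 4.186     # J/(g*K), liquid water
LATENT_HEAT_VAP = 2257.0  # J/g, heat of vaporization at 100 C

mass_g = 1.0
sensible = mass_g * SPECIFIC_HEAT * 1.0  # raise 1 g by 1 K
latent = mass_g * LATENT_HEAT_VAP        # vaporize that same gram

print(f"raise 1 g by 1 K: {sensible:.1f} J")
print(f"vaporize 1 g:     {latent:.0f} J ({latent / sensible:.0f}x more)")
```

Vaporizing a gram of water absorbs roughly 540 times the energy of warming it by one degree, which is why a tiny amount of working fluid can carry so much heat.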
Don't be concerned about the amount of liquid in a heat pipe or vapor chamber; it is a minuscule amount. And for all of you worried about your GPU running at 100°C, don't. As many have already correctly pointed out, varying the pressure above a liquid changes the temperature at which it boils. By reducing the pressure inside the heat pipe or vapor chamber, you can get the fluid to boil at a lower temperature. The boiling point can also be changed by using a different fluid.
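You can see the pressure effect with the Antoine equation; this is a sketch using commonly tabulated coefficients for water (valid roughly from 1°C to 100°C), not a model of any particular vapor chamber:

```python
import math

# Antoine equation for water, with commonly tabulated coefficients
# (pressure in mmHg, temperature in C, valid roughly 1-100 C):
#   log10(P) = A - B / (C + T)   =>   T_boil = B / (A - log10(P)) - C
A, B, C = 8.07131, 1730.63, 233.426

def boiling_point_c(pressure_mmhg):
    """Boiling point of water (C) at the given pressure."""
    return B / (A - math.log10(pressure_mmhg)) - C

for atm in (1.0, 0.5, 0.1):
    p = atm * 760.0  # convert atmospheres to mmHg
    print(f"{atm:>4} atm -> water boils at {boiling_point_c(p):5.1f} C")
```

At a tenth of an atmosphere, water already boils below 50°C, so a partially evacuated chamber can run its evaporation cycle well under the GPU's 100°C reading.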
As for being worried about vapor chambers themselves, they have long been used by NASA, the military, and anyone else who needed to move heat away from a location quickly. They just have not seen widespread use in personal PCs because of cost and lack of need. Remember, it was only a few short years ago that your graphics card did not even have a fan, and before that it did not even have a heat sink. Okay, maybe it has been more than a few years since you did not need a heat sink on a graphics card, but there was such a time; I know I am dating myself. It is only necessity that has driven Nvidia to use a vapor chamber with a heat sink to cool the GPU.
Obviously the card being described is hot and requires more effective cooling than heat pipes spreading heat in only one direction can provide. The vapor chamber will help a lot. If they don't find a way to cool the GPU and spread the heat more effectively, we are going to see more and more three-slot graphics cards become the norm. If they could improve the cooling side, maybe we could even get back to single-slot cards at the high end; wishful thinking, I know. That would actually let you use more of the slots on your motherboard.
I totally disagree with the individual who said his graphics card could be 100 dB (90 dB is roughly the level where prolonged exposure causes hearing damage) if it got him 2x-3x more performance. Noise has become a serious issue. This is why the various PC System Design Guide standards (PC 97, PC 98, PC 99, PC 2001, etc.) regulated not only power usage but also noise. Noise regulation matters because excessive noise takes away from your ability to focus and work on a task, and it interferes with your ability to be immersed in a game or movie. If you have ever watched a movie without the sound, it loses a significant portion of its immersion and emotional effect; the same thing happens if you try to watch a movie with a vacuum cleaner running right next to you. I personally say "thank you" for the addition of the vapor chamber and the better, more efficient cooling that will allow for quieter operation.
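For context on how absurd 100 dB is, remember that the decibel scale is logarithmic; a short sketch (the 40 dB figure for a quiet card is my assumption, not a measured value):

```python
# The decibel scale is logarithmic: every +10 dB is 10x the sound power.
def power_ratio(db_difference):
    """Sound-power ratio corresponding to a difference in dB."""
    return 10 ** (db_difference / 10.0)

quiet_card_db = 40.0  # assumed noise level of a typical quiet card
loud_db = 100.0       # the level the other poster said he would accept

ratio = power_ratio(loud_db - quiet_card_db)
print(f"100 dB carries {ratio:,.0f}x the sound power of a 40 dB card")
```

A 60 dB gap is a factor of one million in sound power, so "100 dB for more fps" is not a small trade-off.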
Of course I agree with the other poster who said that Nvidia and AMD/ATI really need to focus on reducing the amount of heat generated in the first place; a die shrink, anyone, say 32 nm instead of 45 nm? Too bad TSMC can't get its lithography processes working correctly, so there is no die shrink for these reworked DX11 parts this year from either Nvidia or AMD/ATI. A die shrink is what allowed Intel to take the i7 from four cores up to six on a smaller die and still not increase power consumption.
Unfortunately the gaming world has not been as vocal about reducing power usage as the enterprise/industrial side has, i.e. data centers that run out of power capacity while still having lots of floor space. This is why you see Intel and AMD working so much harder than the graphics side of the business to reduce power consumption and to talk about performance per watt. If only gamers would demand that Nvidia and AMD/ATI work on power consumption as hard as they work on graphics performance, we would not need these advanced cooling methods and ridiculously large power supplies. It would also mean real improvements in laptop graphics, which we have not seen in the last few generations of graphics cards. The performance gains in desktop cards have come at the expense of huge increases in power consumption, like the days from the Pentium II to the Pentium 4 when power consumption increased faster than performance. The laptop side of the world cannot handle those huge increases, hence the lack of improved graphics in laptops. Think about it: the high-end Nvidia GTX 480M laptop part is no better than a desktop GTS 450. How pathetic is that? Sooner or later the graphics side of the business is going to have to address power and the resultant heat, just like the CPU players Intel and AMD.
Nvidia says they really care about gaming, but they are more focused on turning their GPU into a processing unit that offloads heavily threaded work from the CPU. It is clear why: Intel can now put a reasonably performing graphics unit right on the CPU package. Yes, the current Core i3/i5 graphics can't play any modern games, but it is only a matter of time before the bottom and mainstream markets are totally eroded out from underneath Nvidia. Nvidia recognizes this; while we gamers care about graphics performance and spend lots of money, we are pocket change in the grand scheme of things, and Nvidia will go the way of the dodo if it does not evolve its business.
For all you people whining about the use of a vapor chamber, the 100°C operating temperature, gravity, etc.: why don't you use some of your time to do a freaking Google or Wikipedia search and learn something instead? It amazes me that some people will spend all day figuring out how to overclock their CPU or graphics card for a few extra fps or MHz, but can't be bothered to look up the technology that delivers those fps or MHz. Use that big lump of mass in your head called a brain, if you actually have one, and do a search instead of just being a "me too" poster.