Graphene Enhances Efficiency of Heat Spreaders by 25 Percent

Guest
"While reading this, all I could think off was... Why don't just try to make the whole thing off this material? Would it work without the copper? If possible, how lower temperatures can/could get?"

Simple answer: no. It's much like indium: a thin film of it greatly improves the thermal resistance of copper joints, but in bulk it is not a very good conductor.
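To put rough numbers on the thin-film-versus-bulk point, here is a minimal back-of-the-envelope sketch in Python; the layer thicknesses, contact area, and the ~82 W/mK figure for indium are illustrative assumptions, not from the article:

[code]
# One-dimensional conduction estimate: R = t / (k * A), steady state.
# All geometry and material numbers below are illustrative assumptions.
area = 0.0004  # 2 cm x 2 cm contact patch, in m^2

def resistance(thickness_m, k_w_per_mk):
    return thickness_m / (k_w_per_mk * area)  # kelvin per watt

indium_film  = resistance(100e-6, 82)   # ~100 um indium interface film
copper_plate = resistance(3e-3, 390)    # 3 mm copper spreader
indium_plate = resistance(3e-3, 82)     # the same 3 mm, but solid indium

print(f"100 um indium film: {indium_film:.4f} K/W")
print(f"3 mm copper plate : {copper_plate:.4f} K/W")
print(f"3 mm indium plate : {indium_plate:.4f} K/W")
# The thin film adds almost nothing, while a thick slab of the lower-k
# material would add several times the resistance of the copper plate.
[/code]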
 

Johnpombrio

So I should grab a pencil lead and a roll of Scotch tape and stick it into my thermal grease? Sounds like a Tom's Hardware DIY plan: roll your own graphene sheets.
 

bak0n

[citation][nom]Bloob[/nom]It means we might see passively cooled 78xx-series... (or more likely 88xx or 98xx)[/citation]

That, or the return of single-slot GPUs en masse.
 

A Bad Day

[citation][nom]bak0n[/nom]That, or the return of single-slot GPUs en masse.[/citation]

Don't worry, there will always be wannabe Fermi GPUs made by simply upping the voltage and clock rate. See the Radeon 6990 and Nvidia GTX 590 as examples.
 
Guest
Yes, because indium just isn't in short enough supply as it is... and because 25% is some earth-shattering gain...
 

shin0bi272

[citation][nom]Kamab[/nom]Graphene is extremely thermally conductive. Carbon has many allotropes, best not to generalize about all of them at once.[/citation]
Well, I was referring to carbon in its natural state... which is graphite. Sure, you can compress carbon and make a huge crystal out of it, and that crystalline version is conductive, but that's not what carbon looks like if you take a bunch of carbon atoms and mash them together at room temperature. Sort of like diamonds being the exception that proves the rule, I guess you'd say. Anyway, the original point I made was that (if you read the entire article) the hotter the temperature got, the worse the conductivity got. Sure, at -23 C it was "extremely thermally conductive," but at +77 C it was almost even with copper. So it has an almost semiconductor-like property to it... which is cool in its own right, just not when it comes to my graphics card.
 

deksman

[citation][nom]__-_-_-__[/nom]Yeah it would, but it would cost even more than your entire house.[/citation]

No, it wouldn't.
Graphene as a material is abundant, and our mass-production technology allowed the material to be used in chip production only about two or three years after it became public knowledge.

To that end, diamond could have been used in electronics and chips (computers) decades ago (all the way back in the '70s, if not the '60s, because the first man-made diamonds were produced in the 1950s, so streamlining the process for 10 or 20 years would have been more than enough to incorporate it into consumer-grade products).
That didn't happen, and not because mass-producing cheap, high-quality man-made diamonds is a problem.
The problem is the diamond cartel De Beers, which holds the industry in its fist, controls supplies, and keeps prices high.

Resources, technology, manpower, and capability are not the issue... it always comes down to money as the root problem.

Transitioning to diamond as a base for technology would effectively allow production of hardware that's at minimum 100x more powerful and efficient than current products.
Graphene is even better than diamond on all those grounds.

Essentially, it would kill all current hardware, and companies would no longer be able to profiteer from making minor revisions (which is exactly what they have been doing for decades now).
People would not be inclined to upgrade for some time to come, because the software alone would have a heck of a time catching up, among other things.


 

Zingam_Duo

[citation][nom]computernerdforlife[/nom]freggo: I'm glad you asked. French is my first language and I've lived in Germany for over a month. I thank you for considering my culture (French), my preferred spoken/written language (English), and my preferred country to travel to in Europe, all in one message. Have a good day, and have a good weekend.[/citation]


Do you travel to Germany by tank? Because we only travel to France by tank. :D
 

Augray37

I sincerely doubt we'll see this technology in real-world use. You know what I mean? We never do. We'll see an article like this every other week, and then it's promptly forgotten. I don't know, it's just frustrating. Why not run with this idea?
 

yumri

What I am wondering about is a graphene-copper heatsink paired with a silver thermal paste, since that should work better for conductivity from what I have heard and read on here, especially at the higher temps. Speaking of higher temps: going by the article, if you use a graphene-copper heatsink you have a lower chance of getting to 77 C or above on most devices anyway.
Sadly, I think this could be just an article on preliminary research and not a finished product, so we will have to wait a few years for anything real to come out of the R&D into this materials research.
 

kyuuketsuki

Somebody correct me if I'm wrong, but my reading of the article was this:

Yes, the conductivity decreases as temperature increases with this material... as it does in pretty much all conductive substances, copper included. But if you'll pay attention to the last paragraph: at 77 C, the copper/graphene material had a conductivity of 440 W/mK, while at *27 C* (much cooler) pure copper has a conductivity of 380 W/mK. So even at the much higher temperature, copper/graphene had better conductivity than copper did at a much cooler one.
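To see what those W/mK numbers mean for an actual spreader, here is a minimal sketch using the article's conductivity figures; the power, thickness, and contact area are made-up illustration values:

[code]
# Temperature drop across a spreader plate: dT = P * t / (k * A).
# The k values are the ones quoted in the article; geometry and power
# are illustrative assumptions only.
power = 150.0      # watts dissipated by the chip
thickness = 0.003  # 3 mm plate
area = 0.0004      # 2 cm x 2 cm contact patch, m^2

for name, k in [("pure copper @ 27 C", 380.0),
                ("copper-graphene @ 77 C", 440.0),
                ("copper-graphene @ -23 C", 510.0)]:
    dT = power * thickness / (k * area)
    print(f"{name:24s} k = {k:3.0f} W/mK -> dT = {dT:4.2f} C")
# Even at 77 C the copper-graphene conduction drop is roughly 14% smaller
# than pure copper's at 27 C, and smaller still when the part runs cooler.
[/code]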
 

danwat1234

[citation][nom]Bloob[/nom]It means we might see passively cooled 78xx-series... (or more likely 88xx or 98xx)[/citation]

No, the heatsink still needs airflow unless it is absolutely massive. Otherwise, yes, thermal conductivity between the GPU/memory and the heatsink would be great, but the heatsink would sit at or above 100 C, and so would the chips!
Yeah, I know I'm a smartbutt and you were joking, but I'm a nerd.
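To show why "absolutely massive" is the operative phrase, here is a minimal sketch of the sink-to-air side, which a better spreader material does not help with; the power and temperature figures are assumptions for illustration:

[code]
# Passive-cooling limit: chip temperature ~= ambient + P * R_sink_to_air.
# All numbers are illustrative assumptions, not from the article.
ambient = 25.0  # C
power = 200.0   # W, a higher-end GPU under load
t_max = 95.0    # C, rough ceiling before throttling

r_needed = (t_max - ambient) / power
print(f"Required sink-to-air thermal resistance: {r_needed:.2f} C/W")
# ~0.35 C/W with no fan demands a very large finned area and good natural
# convection; a better spreader only evens out the base temperature, it
# does not move heat from the fins into the air any faster.
[/code]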
 

freggo

[citation][nom]zingam_duo[/nom]Do you travel to Germany by tank? Because we only travel to France by tank.[/citation]

The last time I was in France (driving a rental car with German plates), I had eggs thrown at me/the car.
Makes one wonder how the two countries deal with their military past.

Especially seeing how the 'loser,' Germany, is doing substantially better than the 'winners,' France and the UK!

 

youssef 2010

[citation][nom]shin0bi272[/nom]"The conductivity of copper-graphene increases with lower temperatures and decreases with substantially higher temperatures. Kasichainula said that the conductivity was 510 W/mK at -23 degrees Celsius and 440 W/mK at 77 degrees Celsius." So the law of diminishing returns strikes again. What's the conductivity at 100C? Since some hardware gets above 77C... If it's worse than copper, this should be relegated to lower-power devices. Since, you know, carbon isn't known for its heat conductivity.[/citation]

Carbon in the form of coal won't conduct heat. But carbon seems to be the new wonder of the world. I'm beginning to feel that this little atom can do almost anything once you find the proper arrangement and the proper isotope for your particular application.
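As for the "what's the conductivity at 100C?" question in that quote, a rough answer can be extrapolated from the two data points the article gives, assuming the trend stays roughly linear beyond the measured range (which the article does not confirm):

[code]
# Linear extrapolation from the article's two quoted points:
# 510 W/mK at -23 C and 440 W/mK at 77 C.
t1, k1 = -23.0, 510.0
t2, k2 = 77.0, 440.0

slope = (k2 - k1) / (t2 - t1)      # about -0.7 W/mK per degree C
k_100 = k2 + slope * (100.0 - t2)
print(f"Estimated conductivity at 100 C: {k_100:.0f} W/mK")
# ~424 W/mK, which would still be above pure copper's roughly 380-400 W/mK.
[/code]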
 

gsacks

There has also been research, which I recall being published last year, on nano-fans built right onto the heatsinks, eliminating the need for conventional fans. I would think the combination of nano-fans and graphene would work wonders. Of course, you'd still need some kind of fan to move air through the system, or an external heatsink.
 

moonzy

You don't beat copper, you join it.
Aside from diamond and silver, copper is about the best common material for thermal conductivity.

I honestly don't think I'd want a graphene-and-copper cooler in my PC cooling my CPU.
Wouldn't a plastic-type material be more prone to leaking EMR?
 