Well, you probably cannot expect them to couple this with 'standard' computers.
It would be a disaster for the most part.
If anything, computers today are an embarrassment to what they could have been.
By now we could have had synthetic diamond as a material for microchips: industrial synthesis appears to have become cost-effective around 1996 (which is when the process seems to have been 'perfected'). Patents then slowed the adoption of synthetic diamond until around 2004, and it wasn't until then, right after the patent issue was resolved, that the first diamond semiconductors were actually made.
So, patents aside, we could have had insanely powerful computers today that would also draw less power than the ones we have now, and run at far lower temperatures (per the premise above).
Now add graphene into the mix, at least in some kind of hybrid form (its thermal conductivity and carrier mobility are often quoted at two to three times diamond's), and voilà.
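To put rough numbers on the heat argument, here's a minimal back-of-envelope sketch in Python comparing the steady-state temperature rise across a die for silicon, diamond, and graphene, using Fourier's law. The heat flux, die thickness, and conductivity values are illustrative assumptions, not measurements (reported conductivities vary widely by sample quality), and graphene's figure is in-plane, so it's included for scale only.

```python
# Back-of-envelope: steady-state temperature rise across a die
# conducting heat through its thickness, via Fourier's law:
#   dT = q * t / k
# where q is heat flux (W/m^2), t is thickness (m), k is thermal
# conductivity (W/m/K).

# Commonly cited room-temperature values; assumptions, not measurements.
MATERIALS_W_PER_M_K = {
    "silicon": 150,    # bulk crystalline Si
    "diamond": 2200,   # high-purity synthetic diamond
    "graphene": 3000,  # in-plane value, for scale only (graphene is 2D)
}

HEAT_FLUX_W_PER_M2 = 100 * 1e4  # assume 100 W/cm^2, converted to W/m^2
DIE_THICKNESS_M = 0.5e-3        # assume a 0.5 mm die

for name, k in MATERIALS_W_PER_M_K.items():
    delta_t = HEAT_FLUX_W_PER_M2 * DIE_THICKNESS_M / k
    print(f"{name:8s}: dT ~ {delta_t:5.2f} K across the die")
```

Under these assumptions silicon picks up a few kelvin across the die while diamond stays well under one, which is the gist of the "more power, less heat" point above.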
But of course, the market will first introduce a silicon/diamond hybrid, followed by a full-blown diamond computer, then a diamond/graphene hybrid, before finally switching over to graphene entirely.
Well, maybe it won't take too long... but given how the market operates, coupled with planned obsolescence, I wouldn't hold my breath. Unless we change the economic model and force manufacturers to offer the best of the best as soon as it's available, designed with upgrades in mind, built to survive more than short-term use, and fully recyclable, only then, I guess, will we start to see some real leaps.