GeForce GTX 590 at PAX East; Asus, EVGA Only?

Shame that the current Sandy Bridge chipsets only support PCIe x8 per card if you pop in two gfx cards... and x4 per GPU = no thx. I'd rather go X58 atm, but the SB CPUs are sooo nice.

Evil Intel, releasing such a good CPU but pairing it with such "weak" chipsets!
 
[citation][nom]al360ex[/nom]*snip* There also seems to be 8x256MB memory modules per GPU. This will allow for 2GB of dedicated memory for each GPU (4GB total). *snip*[/citation]You must need to go back to primary school, as I count six memory modules per GPU, i.e., 6x256MB = 1.5GB each, 3GB total, the same as the GTX 580 (per GPU).

You are aware that the memory chips are the biggest ICs on the board other than the GPUs themselves (oh, and that green PCB-looking chip in the middle)? The memory is positioned closely around each GPU; the smaller ICs are the VRMs.
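
For anyone who wants to check the math, here's a minimal sketch using the figures from this thread (six 256MB modules per GPU is the claim above, not a confirmed spec):

[code]
# VRAM math for the rumored GTX 590, using the module count argued above:
# 6 x 256MB chips per GPU, two GPUs on the board.
MODULES_PER_GPU = 6     # memory chips visible around each GPU (claimed)
MODULE_SIZE_MB = 256    # assumed density per chip
GPUS = 2

per_gpu_mb = MODULES_PER_GPU * MODULE_SIZE_MB
total_mb = per_gpu_mb * GPUS
print(f"{per_gpu_mb} MB per GPU ({per_gpu_mb / 1024:.1f} GB), "
      f"{total_mb} MB total ({total_mb / 1024:.0f} GB)")
# -> 1536 MB per GPU (1.5 GB), 3072 MB total (3 GB)
[/code]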
 
My guess is that if the 580 is roughly $500, the 590 will be closer to $800-$900.

Not quite double, since it's not two independent cards tacked together like the old GTX 2xx dual-PCB cards.

Still, I'm quite happy with my GTX 460 1GB SLI setup, lol. It outpaces the GTX 580 in most cases (except >1080p resolutions) thanks to a healthy 850MHz OC 😵
 
Like Twist3d1080 said, they will be hand-selected GPUs that run at stock speeds on lower voltages (as we all know, not all GPUs are the same). Maybe even GPUs that can take more voltage or reach higher clocks, underclocked to stock speeds, which would lower heat output. I'm sure that kind of stuff can bring the heat down.
 
The card is clearly not finished yet (no DisplayPort included). I hope that's an HDMI port and NOT what it looks like (USB) on the left. It looks to have 2 x 8-pin power connectors, and they have gone to quite some extent to redesign this board. The memory looks to be dual 384-bit, but say it's 2GB of RAM per GPU: that means each GPU can use 2GB, whereas on the 5970 (and possibly the 6990) one GPU can access up to 4GB.
 
[citation][nom]Cash091[/nom]They typically do lower the clock speeds for the dual gpu cards. They did for the 295. Still, I bet this card will be an absolute POWERHOUSE and I can't wait for PAX East!!! I do notice there is no 3-Way SLi support. Would it be too much to ask for a Hexa-core GPU??!? Can this play Crysis 2?? On three screens??? IN 3-D!!!???!!! o_0[/citation]

DirectX only supports 4 GPUs to my knowledge.
 
It looks like there'll be two 8-pin connectors, for a total of 450W.

Someone already corrected the RAM figure you listed, but this slipped through. A single 8-pin PCIe connector provides 150W. Two will provide up to 300W, with another 75W coming from the slot. That gives it up to 375W to work with, not 450W.

I'm interested to see what the TDP will be listed as and how they market this. I thought that to get certified it couldn't exceed 300W, but they are clearly giving it the ability to exceed that mark.
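
To spell that budget out (a minimal sketch; the 150W-per-8-pin and 75W-per-slot ratings are the PCIe figures quoted above):

[code]
# In-spec power budget for a card with two 8-pin PEG connectors.
EIGHT_PIN_W = 150   # rated draw per 8-pin PCIe connector
SLOT_W = 75         # rated draw from the PCIe slot itself

connectors = 2
budget_w = connectors * EIGHT_PIN_W + SLOT_W
print(f"{connectors}x 8-pin + slot = {budget_w}W")  # -> 375W, not 450W
[/code]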
 
"It's unclear why this will be the case"
I'll tell you why...
It will cost $90,000 and no one will buy it due to the extreme price. I remember a day long ago when GeForce was cheaper than Voodoo. After buying 3dfx, I think their heads got a little too big and they just wanted to line their pockets. The last time I was impressed with the cost/performance ratio of a GeForce card was the Ti series of GeForce 4 cards.
 
Shame that the current SB chipset won't allow for 2x16 PCIe lanes natively. The P67 only allows for 2x8, and IMO x4 per GPU in SLI with two of those babies = serious starvation!

Sure, there are some solutions where they glue together PCIe lanes, but it's nowhere near the real deal. Until then I'll keep my current SLI rig, ty!
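
To put rough numbers on the "starvation" worry, here's a quick sketch. It assumes PCIe 2.0 (what P67 and X58 provide), where each lane carries roughly 500 MB/s per direction after 8b/10b encoding:

[code]
# Approximate one-way PCIe 2.0 bandwidth per GPU for the lane splits
# discussed above (x16 native, x8 per card, x4 per GPU on a dual-GPU card).
MB_S_PER_LANE = 500  # PCIe 2.0, per lane, per direction (after encoding)

for lanes in (16, 8, 4):
    print(f"x{lanes}: ~{lanes * MB_S_PER_LANE / 1000:.0f} GB/s per direction")
# -> x16: ~8 GB/s, x8: ~4 GB/s, x4: ~2 GB/s
[/code]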
 
Funny stuff. What I want to know is... these Nvidia co's, they want us to pay? Price point: $700. I have 3-way GTX 275s; that's about my total. Now they want to see if we'll up it? Screw them. 3D is tech from the fn '60s, 1080p has been around since the '70s, and TVs are finally down to around $600, which is great, but video cards costing more than an HD panel TV??? Really? This isn't jewelry. BUY BACK OUR STUFF, recertify, warranty used stuff... help us out. And I am done with EVGA: great cards, I haven't needed the warranty, but customer service blOWS. Makes no sense... lifetime warranty, they'll send you used stuff, so why not sell it CHEAP is all I'm saying.
peace
 
So many comments blasting this card, saying it's gonna be a heater and use too much power. One guy claims he's gonna get the 6990 because it will use less power?????

What???? You do realize the 6990 is gonna be running at 450W with its turbo on. Also, the 6990 will have to have its turbo on, drawing 450W, just to get close to the performance of the GTX 590. The GTX 590 will draw 375W, while AMD pushed the 6990 well past the PCIe spec just to try to match or beat it.

While the same old stuff gets said about Nvidia, most AMD/ATI fans ignore the fact that their Cayman is just as much of a power hog as Fermi. It takes a lot of power to run powerful GPUs with their complex shaders. AMD has created a dual card that far exceeds the PCIe specs, and now things have changed. The HD 6990 will not be using less power than the GTX 590; it can't, because the HD 6990 is absolutely pushing the power envelope to the max.

Nvidia will not let the GTX 590 draw 450W, much less exceed it. At worst they will draw the same power, but I doubt Nvidia will even push it that far because of all the backlash they got over the power usage of the GF100. But it won't matter to these same people who put down Fermi for high power usage. It will not matter to them one bit that the HD 6990 not only uses lots of power but far exceeds even the max power rating in the PCIe specification.
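
For what it's worth, here's how the numbers thrown around in this thread stack up against the 300W certification ceiling mentioned earlier (these are the posters' claimed figures, not official TDPs):

[code]
# Claimed board power vs. the 300W PCIe certification ceiling cited above.
CEILING_W = 300

claimed_w = {"GTX 590": 375, "HD 6990 (turbo on)": 450}  # figures from this thread
for card, watts in claimed_w.items():
    print(f"{card}: {watts}W, {watts - CEILING_W}W over the {CEILING_W}W ceiling")
[/code]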
 
I agree there are a lot of complaints about power, but those people are missing the boat on these types of cards. It really will not matter in any way how much power they need or use; it's an a$$-kicking competition, nothing more.
 
Exactly. I mean, if they can cool it, what's the big deal? High-powered GPUs use a lot of power! Fermi came out sucking watts, but it was powerful. AMD has reached the power of the GTX 480, and now they are power hogs too. The truth is, it takes a lot of energy to make a GPU with such power. The 6990 will be AMD's most powerful GPU ever, and it's gonna suck a heck of a lot of power! Now all I want to see is what they can do with all them watts. That's what it is about.

Really, being on 40nm again, there just isn't a lot of playing room. I commend AMD for their awesome dual switch; I find it a clever way to bring the absolute max to their customers. But I just don't think I'm fully comfortable if it really voids the warranty to be in turbo mode. I mean, it still would be nice to have the option, but it might make me a little wary of it. If it's true, I think I would surely wait and see if it's not gonna be problematic before I switched turbo on. Let some others be the test rats, lol!
 