GeForce GTX Titan Black Edition and GTX 790 Specs Leaked

Status
Not open for further replies.

anthony8989

Distinguished
The VRAM is not the only reason the 290X beats out the 780 Ti; it also has more ROPs and a larger memory bus. Sure, the 780 Ti has higher-clocked GDDR5, but I suspect the timings are not as tight as the Titan's, hence the reason the Titan can smash the 780 Ti in benchmarks.

Like the 680 vs 7970: the 680 was a little faster, but as time went on the 7970 became as much as 15% faster. The same will happen this generation; in my opinion the R9 will be the fastest single GPU out until next-gen 20nm parts.

The 780 Ti and 290X are even at best. The 780 Ti generally wins at 1080p, ties at 1440p, and loses at 4K+ to the 290X. This is because the 780 Ti's specs primarily target 1080p gaming, whereas the R9 shoots for 4K+. However, 4K gaming at ultra settings with a single R9 290X still isn't ideal. The card truly starts to show its stuff in CrossFire, where the higher bandwidth and VRAM count matter more. Two R9 290Xs in CrossFire would be more suitable for 4K+ gaming, just as a GTX 780 Ti (SLI or not) would be more suitable at resolutions lower than 4K. The goals of the cards are too different to say that one is totally better than the other, with regards to gaming performance at the very least.

The GTX 780 Ti is a better-built piece of equipment: a quieter card that uses less electricity and typically has more stable clock rates.

The R9 290X would be, or should be, cheaper, have greater overclocking potential, and offer better performance at the highest available resolutions. PowerTune permitting.
 


Pointing out such things to someone on the AMD influencer program is pointless.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


ROFL. I was thinking he seems ticked... IF I had the money I'd be really thankful either side came out with a $1000 card that wins everything. Since they sell out (even AMD's on launch, whatever the reason), why not put the card out for those who clearly have the means to pay for it? The expensive stuff some of us can afford pays for the REST of us to get decent upgrades yearly. I might think it sucks that I can't afford overly expensive product X, but I sure am glad it exists just in case I can pony up the cash at some point :) NV said just last quarter that they've sold every Titan they made so far, so they still sell well. The new version will probably sell to the same people again since it's a heck of an upgrade, and most likely sucker in a few more who were on the fence last time. These aren't just for crowns now; they actually sell in enough numbers to be a valuable product in the portfolio.

I don't think I'll ever be able to afford a "lambo", but if I ever could I'd buy one. He also acts as though the only card available from NV is $1000. Both sides cover pretty much the whole range (7990 anybody? Asus Rog2 etc. that was $1200-1500 on AMD's side). I don't see AMD stealing anything, as their quarterly report just showed last week. If they are having share gains in GPUs, they are making nothing on them, because the quarter shows only console profits were made. AMD fanboys are currently paying a HIGHER than launch price for their cards too, so I see no difference between them: 290/290X are $500/600, not $400/550. I hope both sides have some "real" stuff coming this year and every year until I die... :) I don't get his points, and it seems like he just needs a job upgrade or something ;)
 


Be careful with your slurs, I might take offence. I also have both AMD and Nvidia cards though so once again you're wrong! :lol:
 

redeemer

Distinguished


I wouldn't say so; aftermarket 290 cards have righted all the wrongs of the reference design. The 780 Ti is still more expensive and has only 3GB of VRAM. As far as clocks go, the 780 Ti can be pushed higher, and that's because of the GK110 revision B. The thing is, the R9 has had unlocked voltage from day one, so under water the 290 will blow past the 780 Ti.

I have both cards, just waiting on waterblocks
 

redeemer

Distinguished


You called me an AMD influencer... isn't that a slur? So which AMD card do you have? I can't believe you have an AMD card... post a pic?
 


So what power sockets are on that card then? I asked a while back but you still haven't answered that.
 

redeemer

Distinguished


The limitation is only in the specification that manufacturers should adhere to; if the card demands more power, it will draw it, plain and simple.
 

The card can only draw as much power as its connectors allow, that's the plain and simple truth. And as you refuse to say what sockets are on that card you allegedly have I have to assume that you know it can't draw more than 300w without melting cables and the PCIe slot! :lol:
 

redeemer

Distinguished


Sorry, but that's not true.

1.187V at a 330W TDP with a 115% power target is already 380W. So if I crank the volts up to 1.21V and the power target up to 200%... take a guess?
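The arithmetic in the claim above can be sketched in Python. This is a hypothetical helper, assuming vendor overclocking tools apply the power-target slider as a simple linear multiplier on the base TDP, which is roughly how they expose it:

```python
# Rough sketch: estimated board power = base TDP * power-target multiplier.
# The 330 W base and the 115% / 200% targets are the figures from the post
# above, not measured values.

def board_power(base_tdp_w: float, power_target_pct: float) -> float:
    """Estimate board power draw for a given power-target setting."""
    return base_tdp_w * power_target_pct / 100.0

print(board_power(330, 115))  # 379.5 W, i.e. the ~380 W in the post
print(board_power(330, 200))  # 660.0 W at a 200% power target
```

At a 200% target the estimate lands at 660W, which is the poster's implied point: well beyond the 300W the connectors are specified for.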
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


Where does Titan smash the 780 Ti? I know of 0 games the 780 Ti loses to Titan in. The old one loses everything but DP FP. The new one, well, that's 2 GPUs so clearly not the same story.

"R9 will be the fastest single GPU out unitl next gen 20nm parts"

It's not the fastest right now and won't ever be without a new rev. You're also forgetting that once the 7970 became competitive, a 780 replaced the 680... creating the same situation again... LOL. Your reality distortion field is set too high. ;) Also, it didn't become faster via magical drivers; they created a GHz edition to catch up. Drivers help both sides, but OCing your cards is not the same thing. Even the superclocked cards don't help here, as hardocp tested one clocked to max vs. merely a REF 780 Ti maxed, and the 290X got its butt kicked. Add a NON-REF superclocked 780 Ti to that race and the story just gets more slanted to NV. This won't change unless AMD can put one out that has more cores turned on or something (doubtful with heat/power already being a problem and 94C), like the 780 vs. 780 Ti.

You're not going to magically outgun NV's driver team with 30% fewer employees working on this stuff than AMD had two years ago. I think consoles put AMD so far out of money for a few too many years that they will never recover, and will shortly give up the GPU race just like the CPU one. Until I see "AMD makes 250mil NET this quarter and hired 10% more engineers and a 10% bigger driver team", I don't expect them to ever get back in this. We are seeing the end of a 5yr pipeline that will now show where the resources truly got shafted to do a console chip. While they were designing console stuff, the GPU work got put on hold, drivers got put on hold (same as BF4 taking over all teams at EA DICE until it was fixed), etc. Doing that caused massive losses (650mil TTM).

http://investing.money.msn.com/investments/financial-statements?symbol=US%3aAMD
Look at profits. They were in the GPU race big in 2009-2011, where they made money. But the resources diverted then show up in 2012/2013, where they lost (FY2013 isn't on there yet, but if you look at the quote on the front page, you can see they are looking at a ~600-650 TTM loss for FY2013), and it will get worse for a while at least as we see how far the pipeline that WAS producing money started to suffer. Hardware was in the pipe and done, while drivers come last, which is why the hardware is OK but the drivers were the main problem that showed up last gen as engineers exited. Going forward we'll see both get hurt unless they start hiring to get back in the game (can't do that without profits). It is very difficult to catch a competitor with 1/3 fewer resources than when you were IN the game just 2yrs ago. In order to pull that off you have to pull a Samsung: make a crap ton of money so you can out-R&D the other guy on pure brute-force cash. Samsung would have no chance of catching Intel without the profits they are making now allowing major R&D (they are #2 in spending now, with only VLKAY beating them I think).

Turn off your reality distortion field or explain how they financially pull off what you're saying ;)
 


I don't need to guess: PCIe slot = 75W, 8-pin PCIe from the PSU = 150W, 6-pin PCIe from the PSU = 75W, giving a grand total of 300W.

Check the PCIe 2.0 specs on this PDF :- http://www.pcisig.com/developers/main/training_materials/get_document?doc_id=b590ba08170074a537626a7a601aa04b52bc3fec

No mention of 150w from the PCIe slot.

Delivering Power to Cards

A 300W Graphics add-in card can receive power by the following methods:

- 75W from the x16 PCIe connector, plus 150W from a 2x4 connector, plus 75W from a 2x3 connector.
- 75W from the x16 PCIe connector, plus 75W from a first 2x3 connector, plus 75W from a second 2x3 connector, plus 75W from a third 2x3 connector. Note that this is not the preferred approach.

A 225W Graphics add-in card can receive power by one of the following methods:

- 75W from the x16 PCIe connector plus 150W from a 2x4 connector.
- 75W from the x16 PCIe connector plus 75W from a 2x4 connector plus 75W from a 2x3 connector.
- 75W from the x16 connector plus 75W from a first 2x3 connector plus 75W from a second 2x3 connector.

Notice how it says 75w from the x16 PCIe and not 150w?
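The budget quoted from the spec is just a sum of per-connector allowances. A minimal sketch of that tally (the dictionary keys and wattages are taken from the PCI-SIG slide quoted above; the function itself is a hypothetical illustration):

```python
# Per-connector power allowances (watts) from the PCIe spec quoted above.
PCIE_POWER_BUDGET_W = {
    "x16 slot": 75,      # delivered through the PCIe slot itself
    "6-pin (2x3)": 75,   # auxiliary PSU connector
    "8-pin (2x4)": 150,  # auxiliary PSU connector
}

def max_spec_power(connectors):
    """Total power a card may draw per spec, given its connector list."""
    return sum(PCIE_POWER_BUDGET_W[c] for c in connectors)

# A 300 W card: slot + one 8-pin + one 6-pin, as on the quoted slide.
print(max_spec_power(["x16 slot", "8-pin (2x4)", "6-pin (2x3)"]))  # 300
```

A card with the slot plus a single 6-pin would be budgeted at only 150W under the same tally, which is why connector loadout matters in this argument.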
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


LOL... I meant he wants to KILL their company like they killed his chipsets, and is young enough to be around to see it if he stayed for 10-20yrs (it's no secret he loves graphics and what he does, with no sign of retiring). You can see he loves what he does every year when he gets on stage. But it's hard to get over that kind of thing (Intel killed their chipset product line, ouch), so he may just give them the bird and ride it out. Ideally, as a stockholder I hope Intel makes the move. But as an end user, this will move us back to PCs being $2000 for a decent one and $3000+ for great ones like the 80's, and we'll slow to a crawl again as they take over everything running either x86 or multi-die shrunk 10nm NV SOCs (you skip the whole 20nm, maybe even 14, and make the first NV/Intel part 10nm). This will be awful for us as users long term, though pretty cool short term for many on the low end, and even for power users for a time as we get 10nm GPUs faster.

I hope people understand it was all about numbers and money in that mile-long post (LOL), not who I like, as this would be VERY bad IMHO. These are the implications of an NVDA purchase by anyone here. Currently I wouldn't touch Intel stock, but if they buy out NV and can get past their HATE stage, I would keep my converted NV->Intel stock from the buyout for the future. I don't see how anyone but Apple could change that story; they are actively trying to kill x86 by removing Intel from their products and moving iOS/MacOS into one ARM-running OS at some point (iOS 8 or 9, I think). They bought PA Semi (the Swift CPU came largely from its engineers) and surely are working on an in-house GPU or will buy someone soon. I suspect IMG.L, who is very good and VERY cheap. Google could change things also. I guess anyone buying NV can make some real changes happen, but the guy best placed to abuse that is the one who already has a dominant position in fabs and 90% of CPUs in computers. Buying a great ARM SOC and the #1 GPU, stealing fab share and maybe forcing TSMC to sit a bit idle (GPUs are big), getting a modem on chip sooner (you own T4i then too), etc., etc. If Google bought NV, they have the Motorola/Nexus lines and could put out a REAL PC instead of a Chromebook then too. But they wouldn't have quite the shot that Intel would have unless they announced that, and that they are breaking ground immediately on a FAB to make it all. With Intel, you have ARM power levels and NV GPU perf on Intel's process; that would be powerful. You have Intel's hardware modem, or NV's software modem. Intel CPUs, or NV ARM Denver cores, etc. How do they lose with this move? Talk about choices and not caring how you win. Linux, Android or Windows, you couldn't care less then. You are covered every way that can win. SCARY.

Basically NV is ripe for the picking if they can get him to sell out (difficult with a person who is passionate about his business, not just doing it to sell out later for billions). There are people who create companies to sell them for a billion when they get lucky, and there are those who create them to RUN forever because they love it. I hope Jen-Hsun is the latter, and he seems to be, or we go down this ugly Intel-buying-them road. We'd be OK as users for a few years until they dethrone everyone, but eventually it's bad for pricing. I think it's the same for ANY purchase of NV by the dominant players. We want them as they are now, or someone we don't want running the show will be ruling our devices like ATT etc. rule our mobile experience. I hope they fail in their 100 billion bid, or they get even more powerful and slow progress in mobile use even more (the caps' power goes up as they cap more countries). There is a reason Reed Hastings (Netflix CEO) sees them all as only semi-friendly now, but we may need regulation to stop them soon. He's right, sadly, if they get more lobbying power etc. Right now I'd rather see a Google buy of NV, as I think that gets us a good 10yrs of cheap products with lots of power, vs. probably half that with Intel. All will turn evil eventually, but Google probably lasts the longest as our semi-friend (no fab to totally rule with, and getting one would take a few years to enter the game, just as it will take Apple a few years to matter). But INTEL? Dominance quickly with 10nm GPUs and SOCs via an NVDA purchase. That sucks. I fear there may be a price (2Bil? 3Bil?) that would make Jen say: screw it, you can have my company. Intel could pay $18B on the purchase and 2-3B to Jen (netting him maybe 10x his current wealth) and laugh pretty quickly after 2015 and 10nm everything. ARM or x86, Intel wins a LOT with both in house and the top gaming GPU.
At the same time they could root for Android/Google and finally get to stab Microsoft with hardware (both are FRENEMIES, but they really hate each other and want to rule it all). They have never figured out how to take each other out, but have fired many shots at each other.

It's comical how all of this is like watching a HUGE soap opera play out... ROFL. Who hates who more? Or enough to try to use NV to kill old enemies? Google, MS, Intel and Samsung are all potentials for various reasons. MS owns a phone now, but no fab, so it would help, but I'm not sure they win either without a fab at some point. Getting as VERTICAL as possible is the name of the game now, and that makes NV worth it to all of these if they can't beat the GPU they have moving into mobile (which is like Qcom's modem dominance of the past). Samsung or Apple can do this with barely 2 quarters of profits; it's nothing to them for what it gains immediately in GPU share, a great SOC, Denver for Xmas with 64-bit, etc. Qcom totally overlaps with NV, so a waste. But Apple/Intel/Samsung need a great GPU/modem (Intel gets a modem on die at some point, but NV gets it there quicker). All would love their gaming experience, and a little CPU overlap wouldn't matter to Apple/Samsung (they can afford that); for Intel, they gain an ARM SOC too.

If Intel buys, I know who wins and have a pretty easy stock move, as they are a perfect match. The other 3 are very interesting and would confuse the crap out of me without a lot of homework and thought on how it turns out. Which is where most of this came from; we just got Intel/AMD's quarterly reports, so I posted this stuff out of all the info that produced for me. Hope it helps someone else make money, or tear it apart so I get smarter if I'm missing data. Samsung would have a decent ride here too, but Intel has the whole x86 side to totally hedge bets. Nobody has that AND great fabs but Intel. Any buyer of NV at worst enhances their chance to survive as mobile changes our world. But Intel is almost a shoo-in if they make a move soon. Can they get over the hate, or pay to get Jen over it? Hmmm... Exciting times. Is the move to a better iGPU in a HUGE portion of PCs over the next few years worth Intel pricing us to death at some point? Not sure. I think it's bad as I buy high end, but good for low-end people, who all get upgraded by a far better IRIS rev2 or whatever. I think it would be much higher than a 640. Imagine on-die Volta for a second, as even Maxwell has HSA now (and CUDA 6 supports it too in prep for Maxwell) ;) I'm thinking you get 750+ like perf at 10nm, maybe more, and the implications are huge. It will be better than AMD's APU on the GPU side, and be 10nm. Look at Kaveri for an idea of a 10nm Broadwell with Maxwell inside. Intel would surely be able to double whatever NV thinks they'll have in Volta's on-die mem with Intel's 10nm, right (check out Iris already with Crystalwell)? Volta is supposed to be 16/14nm; now it becomes 10nm overnight... LOL. WOW. Exciting, yes, but I fear the worst in the end. How would Qcom etc. fight a 10nm ARM SOC from Intel?

I really hope to get more posts like yours looking at the data, rather than the LMFAO guy who has no point ;) Thanks for chiming in. Then again, a dozen of those just tells me I'm right and there is no argument to be made. I don't quite understand what he's LMFAO about. AMD lover? Apple lover? What was his problem with the post? Is he an Intel hater, or something else? Maybe he means they hate each other so much it can't be done? Is he laughing because he thinks I'm crazy, or because he totally gets how Intel could easily rule here? ROFL. I'm confused. Every time we get to quarterly-reports season I go over my stock picks again (you have to, or you get burnt). I posted this here because some of the financial sites don't produce more than fanboy posts, so I take shots here too. Surely there are some techies in here with money in this stuff, and unlike the financial people, more of YOU understand the actual tech involved, so you'd have a different perspective than a Q-report reader, I'd hope.

http://www.phonearena.com/news/Nvidia-Tegra-K1-smokes-the-iPad-Air-and-Snapdragon-800-in-graphics-benchmarks_id51322
Two die shrinks (which is what Intel would give here) would surely take out more than the 740M, right? I'm thinking a 20nm version (the Maxwell version; I don't know if Denver will be 28nm or not at Xmas) already catches this, let alone Intel's 14 or 10nm.
 

redeemer

Distinguished


Wow, so much hostility; obviously you didn't read the post. Nowhere did I state that the 780 Ti loses in games. I said that the Titan beats out the 780 Ti in benching and overall is the better overclocker.
 

redeemer

Distinguished


This is what I am trying to explain to you: the limits from the 6+8-pin plus the PCIe slot are just that, theoretical, a limitation imposed on manufacturers by the PCI-SIG (Peripheral Component Interconnect Special Interest Group).

The x16 PCIe slot can supply up to 75W (3.3V/3A + 12V/5.5A), and the PCIe connectors add 75W (6-pin) and/or 150W (8-pin) for up to 300W total (2×75W + 1×150W). Theoretical limits!
In practice, try it yourself: get a power meter and a modded BIOS, OC that card, and soon you'll see your power draw go over 300W!

TDP (250W for GK110) is not a fixed value either; it refers to the maximum amount of heat the cooling system is required to dissipate from the chip.

The TDP is typically not the most power the chip could ever draw, but the maximum power it would draw when running "real applications". This ensures the chip can handle essentially all applications without exceeding its thermal envelope or requiring a cooling system sized for the maximum theoretical power.

TDP is meant to be the wattage of the processor at load. I say "wattage" because it is unclear whether this most immediately means how much power is consumed in watts or how much heat is produced in watts, but as near as I can tell the TDP is pretty much meant to indicate both.
 


No, because your attitude and posts fit the profile.

And I do have an AMD card, but unlike you I don't need to borrow a card and post a picture of it with a bit of paper, because mine is actually owned and used by myself.
sepb.jpg

 


Hardware limitations are not theoretical and neither is the 300w limit that the connectors to that card can handle.
 

yyk71200

Distinguished
Mar 10, 2010
877
0
19,160


Quote: The six-pin connector uses two +12 V wires to carry up to 75 W, whereas the eight-pin connector uses three +12 V wires to carry up to 150 W. Although these figures are what the specifications allow, the wires and terminals of each connector are technically capable of handling much more power.

http://www.tomshardware.com/reviews/power-supply-specifications-atx-reference,3061-12.html
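The headroom that quote mentions follows from the per-wire current. A sketch of the arithmetic, using the wire counts from the quoted Tom's Hardware passage (the helper function itself is a hypothetical illustration, and real headroom depends on the PSU's wire gauge):

```python
# Current per +12 V wire at the spec's rated power, per the quote above.
def amps_per_wire(power_w: float, wires: int, volts: float = 12.0) -> float:
    """Current carried by each +12 V wire at the given total power."""
    return power_w / volts / wires

# 6-pin: 75 W over two +12 V wires; 8-pin: 150 W over three.
print(round(amps_per_wire(75, 2), 2))   # 3.12 A per wire (6-pin)
print(round(amps_per_wire(150, 3), 2))  # 4.17 A per wire (8-pin)
```

Typical PSU wiring is rated for noticeably more current than these figures, which is what the quote means by the wires and terminals being "technically capable of handling much more power"; the 300W ceiling is a spec budget, not the point where copper melts.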
 


Having seen cables and cards go up in smoke due to excessive power draw I'm not convinced by that.
 