GeForce GTX Titan Inbound, Already Listed at Online Retailer

Status
Not open for further replies.
[citation][nom]Reaver192[/nom]Sign me up for two...... One for Netflix and the other for counter strike force!![/citation]

Counter-Strike Force, recommended specifications: Pentium 4 processor (3.0 GHz or better), 1 GB RAM, DirectX® 9-level graphics card

Overkill, much?
 
This is Nvidia's answer to the new console launches.
Very likely a paper launch, available only in limited quantities.
(Rumored) outperforming the GTX 690 at 235 W would be quite impressive if it turns out to be true.
 
[citation][nom]de5_Roy[/nom]this is nvidia's answer to new consoles' launches. very likely to be a paper launch and will be available in limited quantities.(rumored) outperforming gtx690 at 235watts would be quite impressive if it turns out to be true.[/citation]

No, no it's not. You don't need a Titan to match the power of a next-gen console. Anything better than a 7850, along with a low-overhead OS configuration, and you're already matching or outpacing them.
 

In terms of gaming performance that may be true, but it seems more like Nvidia reasserting its market presence in gaming. Keep in mind that the consoles are heavily rumored to use customized AMD hardware.
 
[citation][nom]pckitty4427[/nom]And why is there a picture of a Tesla GPU?[/citation]

There is a picture of the Titan, but due to NDA, it was all blurred out.
 
My 580 can't really run all games maxed out, and I dislike turning things down, so I have been following this card. But at that price, and knowing I may have to buy two consoles this year, I may just wait on a card upgrade. Also, I can't SLI because my PSU is only 650 W and my motherboard is cheap. Plus, PC games may soon scale up nicely with the new consoles and need much more muscle than they do currently.
 
[citation][nom]cats_Paw[/nom]Great, more name changes to confuse everything...[/citation]
Anyone who is actually planning to get this is hardly going to be confused. And if they ARE confused, umm... they probably don't need it.
 
If this card exists I would probably get it and combine it with my GTX 590 for OpenCL-type work. The 680/690 is a step backwards for compute compared to the 580/590. I expect all the reviews to show the Titan either retaking, or coming close to retaking, the compute crown from AMD.
 
[citation][nom]susyque747[/nom]Now since the new kiddie consoles are about to come out maybe they will make some games that will give this monster card a work out.[/citation]

Yeah, kind of fun that even when the new consoles come out they will likely be only half as powerful as a high-end PC, and a few years down the road they will be a tenth as powerful, just like the current consoles versus gaming PCs!
 
I'm waiting until the end of February; if this card performs as rumored, I'll be more than happy to drop 900 bones on this beast.
 
[citation][nom]magicandy[/nom]Because Tesla is still the name of the brand Nvidia calls their GPUs?http://en.wikipedia.org/wiki/Nvidia_TeslaKepler and Fermi are both among the Tesla brand. Titan is a code name, like Kepler. Tesla is the overall brand.[/citation]

Tesla is the brand name for Nvidia's top-end compute cards. Titan is not a codename like Kepler; Titan is simply the card's name according to Nvidia.
 
[citation][nom]LordConrad[/nom]I'll stick with my GTX 580, it has excellent graphics and compute performance. Best of both worlds.[/citation]

The point of this card seems to be succeeding the 580 in a best-of-both-worlds position from Nvidia. The 580 assuredly won't keep up in compute or gaming.
 
[citation][nom]LordConrad[/nom]I'll stick with my GTX 580, it has excellent graphics and compute performance. Best of both worlds.[/citation]
Yeah, and it consumes over 9000 watts of electricity.
 
I would prefer, a hundred times over, a new design based on the GK110 to consumer recycling of defective chips.

A workstation card made from binned GK110s makes sense. A consumer version does not. A few reasons:

a) There are few GK110 chips. The design is so large that a silicon wafer does not carry many dies, even before binning. Also, Nvidia did not make many production runs, since the server chip market is smaller than the consumer market.

b) A workstation card based on the GK110 will kick ass. It will blow any Quadro (or FirePro) currently available out of the water. Just add a framebuffer chip and a cooling solution (server cards don't have one; the room itself is cooled).

c) The FPUs in the cores are double-precision capable: great for CAD/compute, useless for gaming.

d) The GK110 does not include a geometry preprocessor. Getting a DX11-certified driver will require tessellation to be done in software, and that is slow. Perhaps a Xilinx programmable chip, or a custom-tailored solution, but the volume may not justify the resources needed.


In short: love it or hate it, Nvidia has always delivered better tech in a one-year design cycle than just adding a framebuffer/FPLA to an underperforming chip.


Or at least, that is what I hope.
 
[citation][nom]mamailo[/nom]d) The GK110 do not include a geometry preprocessor. Getting a DX11 certified driver,will require Tessalation to be done in software and that is slow.Perhaps a Xilinx programmable chip.Or a custom tailored solution but volume may not justify the resources needed.In short; Love it or Hate it; Nvidia has always delivered better Tech in a Year design cycle than just add a Framebuffer/FPLA to a underperformer chip. Or at least; is what I hope.[/citation]

Or a CUDA program running on some of the cores to do the same thing.
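For illustration, here is a toy sketch (plain Python, not CUDA, and not any actual Nvidia code) of what uniform software tessellation amounts to: each pass splits every triangle into four by inserting edge midpoints. A CUDA version would run one thread per input triangle, but the subdivision math is the same.

```python
# Toy uniform tessellation: one subdivision level splits each triangle
# into four smaller ones at its edge midpoints.

def midpoint(a, b):
    """Midpoint of two 3D vertices."""
    return tuple((a[i] + b[i]) / 2.0 for i in range(3))

def tessellate(triangles, levels):
    """Uniformly subdivide a list of (v0, v1, v2) triangles `levels` times."""
    for _ in range(levels):
        out = []
        for v0, v1, v2 in triangles:
            m01, m12, m20 = midpoint(v0, v1), midpoint(v1, v2), midpoint(v2, v0)
            out += [(v0, m01, m20), (m01, v1, m12),
                    (m20, m12, v2), (m01, m12, m20)]
        triangles = out
    return triangles

tri = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
print(len(tessellate(tri, 3)))  # → 64 (each level multiplies the count by 4)
```

This also shows why doing it in software is slow: triangle count grows 4x per level, and every new vertex costs arithmetic that dedicated tessellation hardware handles in fixed-function units.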
 
[citation][nom]Draven35[/nom]Or a CUDA program running on some of the cores to do the same thing.[/citation]

It may, but that is only the tip of the iceberg. The names change over time and between manufacturers, but processing units like z-culling, texture compression, fast zeroes, and, most importantly, the thread dispatcher are not present in the GK110, or more precisely are not optimized for graphics.

A dispatcher on a gaming card must divide the frame into tiles and group the vector data (geometry), the scalar data (textures, specular maps, etc.), and the shader program(s) that may apply to each tile, then send the dataset to the cores for the final pixel color calculation.
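The grouping step described above can be sketched roughly like this. This is a simplified software model for illustration only, not Nvidia's actual dispatcher; the 16-pixel tile size is an assumed figure. Each triangle's screen-space bounding box is tested against a grid of tiles, and the triangle is binned into every tile it may touch.

```python
# Simplified tile binner: assign each 2D triangle (by bounding box) to the
# screen tiles it overlaps. Real hardware dispatchers are far more involved.

TILE = 16  # tile size in pixels (illustrative assumption)

def bin_triangles(triangles, width, height):
    """Map (tile_x, tile_y) -> indices of triangles whose bbox overlaps."""
    bins = {}
    for idx, tri in enumerate(triangles):
        xs = [x for x, y in tri]
        ys = [y for x, y in tri]
        tx0, tx1 = int(min(xs)) // TILE, int(max(xs)) // TILE
        ty0, ty1 = int(min(ys)) // TILE, int(max(ys)) // TILE
        for ty in range(max(ty0, 0), min(ty1, height // TILE - 1) + 1):
            for tx in range(max(tx0, 0), min(tx1, width // TILE - 1) + 1):
                bins.setdefault((tx, ty), []).append(idx)
    return bins

tris = [[(2, 2), (30, 4), (10, 28)],     # spans tiles (0,0)..(1,1)
        [(40, 40), (44, 40), (40, 44)]]  # fits inside tile (2,2)
bins = bin_triangles(tris, 64, 64)
print(sorted(bins))  # → [(0, 0), (0, 1), (1, 0), (1, 1), (2, 2)]
```

Once binned, the geometry, texture, and shader data for a tile can be dispatched to the cores as one coherent work package, which is the grouping the comment describes.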

That machinery has been perfected (at a cost of millions of dollars) by trial and error, to the point that the pipeline works 2-3 frames ahead of the one currently displayed. None of it is present in the GK110; it was not needed, and Nvidia will not put it in the drivers. That would expose its most valuable industrial secrets to competitors (and the Linux community).

Also, the GK110 does not have raster backends (ROPs). The output from the cores is useless if it is not written out as pixels: technically as 32-bit values, in burst writes by tile, displayed as full frames. If you try to write a frame using individual bytes, the memory bandwidth will sink faster than the Titanic.
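Some back-of-the-envelope arithmetic on why burst writes matter. The numbers here are illustrative assumptions for the sake of the calculation, not GK110 specifications: a 1920x1080 frame at 32 bits per pixel, written in 32-byte bursts versus one byte per memory transaction.

```python
# Illustrative transaction count for writing one 32-bit 1920x1080 frame:
# coalesced 32-byte bursts versus naive one-byte writes. The burst size
# is an assumed figure for the arithmetic, not a GK110 spec.

WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4          # 32-bit pixels
BURST = 32                   # assumed burst/transaction size in bytes

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
burst_writes = frame_bytes // BURST
byte_writes = frame_bytes    # one transaction per byte

print(frame_bytes)                  # → 8294400 bytes per frame
print(burst_writes)                 # → 259200 transactions with bursts
print(byte_writes // burst_writes)  # → 32x more transactions byte-by-byte
```

At 60 frames per second the per-byte scheme would need tens of millions of extra transactions every frame, which is the "sink faster than the Titanic" point above.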

If the technical hurdles are not enough to ditch the idea of using binned chips, let's examine the economics:

a) Any card based on binned chips will have unstable availability. It will nosedive after launch. Early adopters of the GTX 680 remember how much "fun" it was to click the refresh button at Newegg.

b) With the foundries working near the top of their capacity and no production runs to spare, why would anybody in the world produce a very expensive die and then cripple it on purpose, just to meet low-profit demand? Idiotic.

c) Why produce a card that will be considered "expensive" at a grand when you can produce a workstation card for two grand+ and be considered a bargain? Even if it is a clearance sale.

In fact, the GK104's double-precision performance is so bad that if the GK110 sees the light of day as a $1,000 card, I don't think gamers will get many. CAD users and Hollywood will eat them up.
Paired with a 6-to-8-core PC, it has to be the best deal for raytracers since the Amiga + Video Toaster.


What I expect is a new design (call it GK111 or whatever) with all the missing bits and pieces, one that gamers can buy at will.

But I keep daydreaming of a GK110 in a workstation....
 
[citation][nom]TheMadFapper[/nom]No, no it's not. You don't need a Titan to match the power of a next-gen console. Anything better than a 7850 along with a low overhead OS configuration and you're already matching or out pacing them.[/citation]

You are right, but then again, one of the very reasons people game on a PC rather than on a console is access to better hardware. I don't expect to merely match a next-gen console; I expect to surpass it before it even launches.
 
check this out
http://wccftech.com/nvidia-officially-unleashes-geforce-gtx-titan-gk110-gpu-decimates-single-chip-gpus/
 