Upcoming Nvidia CPU (aka Denver)

ratclifff4

Distinguished
Aug 14, 2010
60
0
18,630
How's it going, everyone? Has anyone heard of Nvidia's Project Denver? Supposedly Windows 8 will support this CPU. I think it's cool that Intel might finally have some competition in the high-end market again. I'm currently running a watercooled QX9650 @ 4GHz for gaming and it seems to be handling everything so far. The only upgrade I think I might need is an Nvidia Kepler 28nm card when they roll out, and then it's all about waiting for the "Denver" CPUs to launch and get reviewed. I'm thinking by the time that happens Intel's "Ivy Bridge" will be in its second generation, or the tock cycle of Intel, and I'm wondering if Nvidia is going to blow their ship out of the water or if it will just be somewhat competitive. What do you guys think?
 

ratclifff4

Distinguished
Aug 14, 2010
60
0
18,630


That is correct. That's why Win8 will be the first Windows OS to support it. I'd sure like to know how many cores this thing is going to have and at what speed. Nvidia's pretty good about keeping secrets, though.
 
If it's an ARM chip, it's likely going to roll out for tablets and other mobile devices first, and probably only as a dual core, since there really is no justification for a quad in small devices. Adding ARM support to Windows is just a good idea in general, as it allows for an even larger market share.
 

ratclifff4

Distinguished
Aug 14, 2010
60
0
18,630


I read somewhere that they are going to be making full-blown desktop and server CPUs. They already have the Tegra 2 for mobile devices and the like. I'll try to find the article I was reading.
 

bobdozer

Distinguished
Aug 25, 2010
214
0
18,690
It takes nVidia six months to a year to beat AMD's GPUs, and only then because theirs are much larger chips with higher power draw. In technology terms they cannot compete with AMD's graphics division, which has less than half of nVidia's budget.

What makes you think this would suddenly change with a CPU? The best Nvidia would be able to do is match Intel's GPU performance, i.e. not good enough by far.
 

ratclifff4

Distinguished
Aug 14, 2010
60
0
18,630


I just found a better article, straight from the horse's mouth.

http://blogs.nvidia.com/2011/01/project-denver-processor-to-usher-in-new-era-of-computing/

Unfortunately, sometimes the links posted on Tom's Hardware result in a blank page. I got this article from Nvidia's website: I got there by Googling (nvidia denver) and clicking on

Search Result: “Project Denver” Processor to Usher in New Era of Computing « NVIDIA

Check it out; this thing might pull the rug out from underneath Intel.
 

ratclifff4

Distinguished
Aug 14, 2010
60
0
18,630


From what I understand, Nvidia has been working on this for a few years now and the information is just now coming out. Just like AMD, they are going to the 28nm process, so I'm guessing they finally have the right tool for the job. Why is Bulldozer taking so long to roll out? I've been hearing the name for years now. Nothing on the shelves.
 

bobdozer

Distinguished
Aug 25, 2010
214
0
18,690


Nvidia is full of it. The die shot they used for this "Project Denver" was nothing but a false-colour, Photoshopped Fermi die shot. They don't have a CPU, and even if they did it would be crap like their GPUs are. The mere idea of anybody believing this nonsense is bizarre; Nvidia should concentrate on making their GPUs less hot and power-hungry.

If you think Intel or AMD are even slightly worried about this, think again. It's just the latest in a long line of white elephants coming out of Nvidia HQ.
 

bobdozer

Distinguished
Aug 25, 2010
214
0
18,690
"Project Denver"

NVDA_Project_Denver_Die_689.jpg


"Project Denver" and Fermi.

youhavebeentrolled.jpg


Recoloured

rickrolled.jpg
 

ratclifff4

Distinguished
Aug 14, 2010
60
0
18,630
Thanks for the die shots you mentioned. I can't really speculate on how a product is going to perform even if the lead engineer were pointing out the details step by step right in front of me with a picture. I'm hoping that Nvidia will release more info regarding how many GPU/CPU cores are going to be on this thing and at what speeds. I kind of wish Microsoft would elaborate a little more on whether they will support a whole slew of cores or just 4 or 8. I hope they tweak Windows 8 to support as many cores as are available.
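As a side note on core counts: whatever limit Windows 8 ends up with, you can always check how many logical processors the OS actually exposes to programs. A minimal Python sketch (purely illustrative, nothing Nvidia- or Windows-specific here):

```python
import os

# Number of logical processors the OS exposes to user programs.
# On a quad-core with SMT/Hyper-Threading this would report 8.
count = os.cpu_count()
print(f"Logical processors visible to the OS: {count}")
```

This reports what the scheduler exposes, not the physical core count, so an OS-imposed cap (or SMT) would show up here.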
 


Don't know why you are thanking him -- and you are correct that it is your thread (in which, in the OP, your supposition was that Denver might blow Chipzilla's ship out of the water).

If that ain't drinking the Jen-Hsun Huang Kool-Aid, I don't know what is :lol:

The question you must ask: why does Jen-Hsun Huang allow 'cooked' presentations and puffery to take away from what may be a decent product?


 

ElMoIsEviL

Distinguished
DISCLAIMER: I personally think that nVIDIA is the worst company in the IT industry in terms of their "moral" and "ethical" conduct. I should point this out before posting... so keep in mind that I am biased when it comes to nVIDIA.

I think it will be decent for tablets and mobile devices, but I don't see it being good for desktops. Why? Because nVIDIA makes the worst chipsets this side of YUANG GIANG (never heard of YUANG GIANG? That's because I made them up... nVIDIA is so horrible my imagination cannot even keep up with how bad they are at making chipsets).

I recently retired an AMD Athlon64 X2 3800+ Socket 939 nForce4-SLI rig because of nVIDIA's piss-poor chipsets and chipset drivers.

I replaced it with a Pentium D 3.2GHz, which outperforms the Athlon64 X2 under Windows Vista and 7 due to the horrible SATA/IDE drivers from nVIDIA.

So... that being said... how are nVIDIA going to implement driver cheats for their CPUs? Will they round off results in order to cut down on processing times? Hehehe.
 

ratclifff4

Distinguished
Aug 14, 2010
60
0
18,630


I don't know what "Chipzilla" is. I Googled it and didn't find anything that relates to any kind of chip. As far as "the question I must ask" goes, that doesn't make any sense. Take away what from what??? I don't get it. What does that even mean?
 

ratclifff4

Distinguished
Aug 14, 2010
60
0
18,630


You should buy three GTX 480s and run them in tri-SLI.
 

ElMoIsEviL

Distinguished


Godzilla = Big/Enormous and Dominant.
+
Intel = Semiconductor company or "Chip" company which commands a big/enormous/dominant position in its respective market space(s).

= Chipzilla.
 

ratclifff4

Distinguished
Aug 14, 2010
60
0
18,630


Got it!
 

bobdozer

Distinguished
Aug 25, 2010
214
0
18,690


I laughed. :D
 

ratclifff4

Distinguished
Aug 14, 2010
60
0
18,630


If you hurry you can still get the GTX 480 at EIO.com for $540 each. Get three of those and you'll have a slammin' system!
 

ElMoIsEviL

Distinguished


I don't have a problem with their GPUs overall. I have a problem with everything else they do.

Client-side PhysX is so inefficient that it is useless (unlike server-side PhysX, which works great). I mean, 112 CUDA cores not able to properly process a bit of smoke, while the Ghostbusters CPU physics engine can do wonders with a simple quad-core processor?

My take is that their CPUs will be as horrible as their chipsets. Trying to bring ARM technology into the mainstream is an exercise in futility. ARM platforms are great... for low-power mobile devices. Their RISC-like architecture is not tailored for consumer-grade desktop use, and especially not for server/multimedia usage.

nVIDIA is trying to get around that with Tegra (and its other derivatives). I'll admit these things do get my attention, as I am an Android user, but I don't want to see ARM/Tegra/CUDA take over from x86/OpenCL/DirectCompute.

Either way... it would be nice to see nVIDIA take a few more hits profit-wise.
 

ElMoIsEviL

Distinguished
ElMoIsEviL - upgrade your PhysX card..
I run a GTS 240 (OEM) as a PhysX card, single slot and one of the better-performing single-slot cards they had at the time, still is..
Main gfx card is a GTX 560 Ti.

Point is that we shouldn't have to upgrade our PhysX cards that often. 112 cores ought to be more than enough to power PhysX.

That being said... I've since ordered a Galaxy GeForce GTX 460 1GB (single-slot) card from Newegg.ca.
 
Okay, I am really not sure what to expect from this now. Will the Nvidia CPU likely be better than AMD's CPUs? What is an ARM CPU? Since Nvidia has been doing GPU stuff for a long time and still has a lot of issues, why would they try to be competitive in the CPU market? And where would their CPU place in between Intel and AMD?
 


If it's anything like their mobo chipsets, I won't buy it.