AMD Demonstrates First DirectX 11 GPU

Status
Not open for further replies.
[citation][nom]jaydeejohn[/nom]I see a lot of crapping on AMD here, and what's really funny is, the main difference between i7 and C2D is SMT, which really won't be enhanced by W7, but real core perf, where AMD does OK[/citation]

SMT? You're kidding me, right?
 
[citation][nom]anamaniac[/nom]This sounds wonderful, I hope AMD lives up to their promises. AMD is screwed in the CPU market right now... Let's hope that buying ATi was the right way to go. The better multi-core support is the part that interests me. I like ATi cards but I also like Intel CPUs.[/citation]

There's nothing wrong with that to me; it shows you're not a fanboy :)

[citation][nom]Redraider89[/nom]"AMD has about to November to release something that doesn't suck." So now Core 2 Duos suck, right, since Phenom IIs are comparable to them?[/citation]

Well, that all depends: how long has Intel had that old piece of tech now?

 
Conroe... Aug 2006. Penryn quads (which are superior to the Phenom II line)... Aug 2008. Deneb was Feb 2009... so I guess that puts AMD about six months behind on releasing competitive mainstream parts. For the high end... they've got nothing.
 
I bet that any decent DX10 card just needs a driver update to support DX11, although maybe AMD and Nvidia will not release DX11 drivers for those cards.
 
[citation][nom]marraco[/nom]I bet that any decent DX10 card just needs a driver update to support DX11, although maybe AMD and Nvidia will not release DX11 drivers for those cards.[/citation]

Considering Nvidia cards can't even support DX10.1 at the hardware level... doubtful.
 
[citation][nom]fulle[/nom]Considering Nvidia cards can't even support DX10.1 on the hardware level... doubtful.[/citation]
I mean that the most important features of DX11 are just driver implementations of capabilities today's hardware already has.

I have already run tessellation code on CUDA, on an 8800GT.
 
Running tessellation code in a CUDA-supported app and having DX11 tessellation in games are so freaking different I want to slap you. Also, MS isn't going to tag a card as having DX11 support unless it supports ALL of the DX11 features. "Most" doesn't cut it.
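That "all or nothing" rule is easy to sketch as a toy model. The feature names below are illustrative, not Microsoft's actual checklist, and the check is just a set comparison:

```python
# Representative subset of what Direct3D 11 requires at feature level 11_0
# (illustrative names; the real required list is longer and more precise)
DX11_REQUIRED = {
    "tessellation",
    "compute_shader_5_0",
    "multithreaded_rendering",
    "shader_model_5_0",
}

def reports_dx11(card_features):
    """A card only gets the DX11 label if it supports *every* required feature."""
    return DX11_REQUIRED.issubset(card_features)

# A DX10-class card that can run a tessellator in CUDA still isn't DX11:
g92_like = {"shader_model_4_0", "tessellation"}  # hypothetical feature set
assert not reports_dx11(g92_like)
assert reports_dx11(DX11_REQUIRED | {"bonus_feature"})
```

Supporting "most" features simply returns False; that is the whole point of a strict feature level.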
 
"Upgrade for Vista? Why didn't Microsoft ever release 10 for XP? Did they really think they could convince gamers to move to the untested Vista just by tying DX10 to the OS? Did anyone upgrade to Vista solely for DX10, and was it worth it?"

Actually, yes. The only reason I got Vista was DX10. Granted, it turned out to be not quite the advance I was hoping for, but I still don't want to go back to DX9. And actually, minus a few issues with my Creative sound card not playing some games in surround at first, I haven't really had any problems with it. I am running XP in a dual boot, but truthfully I have not used XP since I got Vista. It is sitting on a basically empty drive.
 
[citation][nom]Kill@dor[/nom]That's very good news about DX11. It's time to push computing to the limit and beyond... I'm really getting tired of buying hardware that offers marginal performance at a high cost. These companies need to realize we are still in a recession; we still want to purchase their products, but with fair capitalism. Props to AMD for first rant/bragging rights ^_^[/citation]

I read somewhere that, technically, ATI had the first DX10 card. They also had the first DX10.1 card. ATI always pushes new tech sooner: they had a 512-bit bus first, in the 2K series, and now have GDDR5.

[citation][nom]xyzionz[/nom]I'll stick with my dual 9600GTs until the GTX3xx or DX11, since the GTX2xx still uses GDDR3. I read something about the GTX3xx using GDDR5, which is good news. And I just hope Nvidia stops using the G92 as a base to design a new chip by throwing in more SPs or transistors and moving it to a smaller die process, but I also read that the GTX3xx will be a cGPU design and use MIMD[/citation]

It only takes nVidia 1-2 gens to catch up to ATI. Next, ATI will push a 512-bit bus with GDDR5. Currently nVidia has to use a much more power-hungry 512-bit bus to get the same memory bandwidth with GDDR3 as ATI gets with GDDR5 on a 256-bit bus. Kinda crazy, really.
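The bandwidth claim checks out with the usual peak-bandwidth formula (bus width in bytes times effective transfer rate). The clocks below are the published effective rates for a GTX 285 and an HD 4870; treat them as approximate:

```python
def bandwidth_gbs(bus_bits, effective_mts):
    """Peak memory bandwidth in GB/s: (bus width in bytes) * (transfers per second)."""
    return bus_bits / 8 * effective_mts * 1e6 / 1e9

# GTX 285: 512-bit bus, GDDR3 at ~2484 MT/s effective
gtx285 = bandwidth_gbs(512, 2484)   # ~159 GB/s
# HD 4870: 256-bit bus, GDDR5 at ~3600 MT/s effective
hd4870 = bandwidth_gbs(256, 3600)   # ~115 GB/s
```

So GDDR5 roughly doubles the per-pin transfer rate, which is how a 256-bit bus lands in the same ballpark as a 512-bit GDDR3 bus.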

[citation][nom]jaydeejohn[/nom]I see a lot of crapping on AMD here, and what's really funny is, the main difference between i7 and C2D is SMT, which really won't be enhanced by W7, but real core perf, where AMD does OK[/citation]

Um, what? W7 improves on multi-core utilization over Vista, and seriously, JDJ, SMT?

You know as well as anyone else that Core i7 has a lot more changes than just SMT.

And yes, IF DX11 lives up to its promise of utilizing multi-core CPUs better, then it will start to push games in Core i7's favor (or a higher-end Core i5's) due to SMT and the obviously better arch that Core i7 has. Even with SMT off, a Core i7 outperforms everything else in multithreaded benchmarks (C2Q and PhII) by a pretty good margin. SMT just enhances that by a lot.
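The way DX11 spreads rendering over cores is roughly this: worker threads each record commands into their own deferred context, and then one thread submits all the finished command lists to the GPU. Here is a toy Python model of that pattern (the class and names are mine, not the actual D3D11 API):

```python
import threading

class DeferredContext:
    """Toy stand-in for a D3D11 deferred context: records commands on its own thread."""
    def __init__(self):
        self.commands = []

    def draw(self, obj):
        self.commands.append(("draw", obj))

def record(ctx, objects):
    # Each worker records into a private context -- no shared lock needed.
    for obj in objects:
        ctx.draw(obj)

contexts = [DeferredContext() for _ in range(4)]
threads = [threading.Thread(target=record, args=(ctx, range(i * 10, i * 10 + 10)))
           for i, ctx in enumerate(contexts)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# The "immediate context" then plays back every command list on one thread.
submitted = [cmd for ctx in contexts for cmd in ctx.commands]
assert len(submitted) == 40
```

Because recording dominates submission cost in draw-call-heavy scenes, more hardware threads (including SMT ones) directly translate into more recording throughput.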
 
[citation][nom]hellwig[/nom]Upgrade for Vista? Why didn't Microsoft ever release 10 for XP? Did they really think they could convince gamers to move to the untested Vista just by tying DX10 to the OS? [/citation]

They couldn't implement DX10 in XP because of the core of the operating system and how the kernel tied in with the graphics display. In Vista (and subsequently Windows 7) the two are decoupled, which allowed DX10+ to be properly integrated and upgraded without the fuss of touching the entire core of the OS. What's remarkable is that Linux has been doing this for many, many years; MS actually implemented something from Linux in its new operating systems.

 
GDI can now also do real multithreading under DirectX 11 and WDDM 1.1. There is a real advantage in performance and responsiveness when multitasking with multithreaded programs on a multi-core system. A lot of problems were caused by the single-threaded design of GDI: one application got the whole card to itself, and only one thread at a time could actually use it, which really slowed down multithreaded applications.

Now, 10-15 years later, this is finally done the way it should have been done long ago. MS has finally fixed the mess that was GDI. That is the reason Windows 7 can take better advantage of multi-core CPUs.
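The old behavior is easy to model: put one global lock in front of every draw call and concurrent "applications" serialize completely. This is just the locking pattern, not real GDI:

```python
import threading
import time

gdi_lock = threading.Lock()   # models GDI's single global lock
events = []                   # ordered log of who held the "card" when

def draw(app_id):
    # Every "GDI call" must take the one global lock first.
    with gdi_lock:
        events.append((app_id, "start"))
        time.sleep(0.01)      # pretend to render
        events.append((app_id, "end"))

threads = [threading.Thread(target=draw, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# With one global lock, draws never overlap: each "start" is immediately
# followed by the same app's "end" -- four threads, zero parallelism.
for i in range(0, len(events), 2):
    assert events[i] == (events[i][0], "start")
    assert events[i + 1] == (events[i][0], "end")
```

Dropping the global lock (per-context state, as in the WDDM 1.1 model) is what lets those four threads actually run concurrently.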
 