Nvidia: DirectX 11 Won't Define GPU Sales

Status
Not open for further replies.
[citation][nom]Curnel_D[/nom]"Nvidia says that special-purpose software relying on GPGPU will propel GPU sales, not PC gaming." This is as good as admitting defeat.[/citation]
GG Nvidia, it was nice knowing you...

ATI is definitely going to be my next graphics provider now.
 
Seems a little Radeon-biased. Radeon supports OpenCL but not CUDA; NVidia cards support CUDA but not OpenCL. Why is it said as though OpenCL is some huge oversight on NVidia's side but not the reverse? Am I missing something?

I admit the lack of DX11 from NVidia is worrisome, though.
 
Nice one, Nvidia: claim that something you aren't developing won't be a big deal, and make it sound like everyone else is wasting their time. Pretty weak move.
 
Who the eff cares about DX11? Games that are compatible with DX10.1 are few and far between. The GPU tech has gotten beyond the games. You'll end up with under-used Radeon 58xxs and one or two games that support DX11. Nvidia is going after ATI's mainstream distributor share. They are trying to get Dell, HP, and Sony to use their GPGPUs instead of a DX11 no one can utilize, just like ATI is making DX11 cards to try to cut into the gaming market. Not gonna happen; people just bought their GTX-series card or their 4800 card, and they aren't gonna upgrade just for DX11 any time soon.
 
nVidia: We cannot meet DX10 specifications, can you lower them?
nVidia: DX10.1 isn't a large enough upgrade to support.
nVidia: DX11 isn't important.

Seems to me that if nVidia can keep the 2005 market, it would be fantastic. They didn't do their R&D, and now they're paying for it. My bet is that AMD buys nVidia if they go far enough down. Then AMD can claim gaming.
 
It appears Nvidia doesn't have a DirectX 11 GPU yet and is talking out their you-know-what. Of course people buy $500-plus video cards to transcode video to their handheld devices quicker, not to play video games.
 
That reminds me of when the Voodoo graphics guys sneered at nVidia's 32-bit color support and said it was just overkill, and that 16-bit color was enough.

Since nVidia is now on the other side of that stick, I can only assume they are trying to stall potential buyers and persuade them to wait for nVidia's own DirectX 11 cards.
 
Oh yeah, and DX11 is backwards compatible with DX10 and 10.1, just like DX10.1 was. It just doesn't make any sense not to support it, as the majority of cards now support at least DX10.
 
Here is how it works out: if nvidia can't support OpenGL, then they can't play certain games, like emulators based on OpenGL, and they will have a real hard time with anything to do with graphics in Linux, period.
 
Maybe if NVIDIA had produced a new GPU in the last year, the market would look better for them. They haven't had any new technology since the G200 came out, and that was ONE single chip. Meanwhile, ATI kicked their butt price/performance-wise with the R700 series (which comes in at least 4 variants). And before you mention the GeForce 9000 series or any GeForce 250 or lower, remember: all of those are based on the G80/G90-series chips.

Any time you have to reduce your product's price 50% or more (as NVIDIA did after the Radeon 4870 came out), it indicates a serious over-estimate of value on the part of the company. Add in the heat-induced failures plaguing numerous laptops with NVIDIA IGPs, and the relative failure of the PS3 (the only latest-gen console sporting NVIDIA graphics), and NVIDIA has to do some serious rethinking.

Maybe this comment about DX11 is a product of that rethinking. Maybe they've given up on consumer graphics and really just aim to push GPGPU towards the science and research sector.

It's sort of a shame to see them go. While I switched to ATI with the Radeon 4850, every GPU I owned before that had been a GeForce.
 
[citation][nom]FlayerSlayer[/nom]Seems a little Radeon-biased. Radeon supports OpenCL but not CUDA; NVidia cards support CUDA but not OpenCL. Why is it said as though OpenCL is some huge oversight on NVidia's side but not the reverse? Am I missing something?[/citation]

Yeah, you are. The two standards mentioned in the article are open standards that anyone can implement, so developers can write software that will simply run as long as whatever the end user is running is standards-compliant; they don't need to worry about what specific hardware is there. It's just like how most 64-bit software is written to the common subset of instructions that both AMD and Intel CPUs can run.

CUDA is just some shit that nVidia made up that only runs on their cards, period, and now that an open implementation is coming, it will die. Same with ATI's Stream stuff, which is the same thing, just ATI-only.
 
[citation][nom]FlayerSlayer[/nom]Seems a little Radeon-biased. Radeon supports OpenCL but not CUDA; NVidia cards support CUDA but not OpenCL. Why is it said as though OpenCL is some huge oversight on NVidia's side but not the reverse? Am I missing something? I admit the lack of DX11 from NVidia is worrisome, though.[/citation]
I believe CUDA is a closed nVidia thing, but OpenCL is an open standard anyone can implement, and not working with open standards is a bad thing.
 
Honestly, I'd feel a bit of sympathy if Nvidia hadn't been the trash-talking drama queen of the tech world these last few years.

For the last couple of years now, they've had to really flog CUDA and PhysX to cover up their lack of research and development. Now it's finally catching up to them, and they feed us some stupid crap about how GPGPU computing is more important than gaming.

They're not just losing at this point; they're coming up with really stupid excuses right after finishing their trash talk.

After their rip-off rebranding scheme, their GPU failure issues, and their overall shoddy business practices, I'm finished with Nvidia.

If Intel knocks them down to third place in the discrete market with Larrabee, then good riddance.
 
Does it really matter that they don't have DX11 cards yet?
The first cards of a generation have never been good, even for ATI:
with DX10, the 3850 and 3870 sucked.
 
[citation][nom]Ehsan W[/nom]Does it really matter that they don't have DX11 cards yet? The first cards of a generation have never been good, even for ATI: with DX10, the 3850 and 3870 sucked.[/citation]
DX10 and DX11 have a lot of similarities. And the 38xx series hardly sucked; it couldn't hold a candle to the GTX 2xx series, but it kept up with, and in many cases passed, the G80/G90 chips, which is what it was designed to do.
 