Apple Ditches Nvidia, Goes ATI-only for Desktops

Yeah, AMD is the way to go for "high-end" desktops... doesn't Apple know that Nvidia's CUDA technology is wildly important for most Apple users, whether or not the user knows it? Adobe CS5 and Nvidia is a match made in heaven... I guess a 5770 will do the trick, but how about some 460 Fermi?
 
[citation][nom]Cwize1[/nom]I can't help but wonder if this is some elaborate ploy by ATI and Apple against Nvidia and Adobe. That is, those buying a new Mac Pro won't be able to use the GPU acceleration feature of CS5. This would mean less reason to choose CS5 over Final Cut and therefore less market penetration for Nvidia's CUDA.[/citation]

Since when is Apple a significant market???
 
Can't help but think it's another move against Adobe... With the right (Nvidia) graphics card, CS5 screams. With ATI, it's pretty average... but not if you read ATI's literature, of course.
 
The people who truly know GPUs know that Nvidia is the leader over ATI/AMD any day of the week. ATI's drivers have always blown. All they do is take the engineering that Nvidia creates and make it bigger: more RAM, more cooling, more pipelines, etc.

This would actually stop me from buying a Power Mac, as the last one I had was the liquid-cooled 2.5 GHz quad. Oh well, maybe the price will lower on a Quadro card now.
 
To the poster about the IPS: LED backlighting sucks for color quality, and glossy screens suck for color quality. That's why hi-res 27- and 30-inch displays are so expensive. IPS and CCFL is the way to go... trust me, a pro photographer would never EVER even consider using a glossy screen, even an IPS one. Do a little research and you'll see what I'm talking about.
 
[citation][nom]back_by_demand[/nom]Could "You are (amusing anecdote) it wrong" be the "Can it play Crysis" of 2010?[/citation]

It already is, my good man...it already is.
 
BTW, the article is wrong in saying that Apple is going exclusively with ATI for desktops. The last couple of Mac mini models still use Nvidia graphics.
 
[citation][nom]wymer100[/nom]Apple uses Nvidia for laptops and AMD/ATI for desktops. It probably just comes down to thermals and price. Speaking of price, don't discount the displays, particularly on the high end. The high-end system has a 27-inch LED-backlit IPS display. It's really awesome to see in person. The closest thing I could find on Newegg was a 26-inch ViewSonic IPS display on sale for $900 (regularly $1000); I'm not sure if it was LED-backlit. If you add the display into the equation, the high-end iMac is a pretty good deal. For all those people who will complain about not being able to use the display when you buy a new computer, well, Apple took care of that a while ago. There's a Mini DisplayPort on the back that allows video-in, so you can hook up a different computer or a Blu-ray drive.[/citation]

You can now buy the Apple 27-inch display separately. You're right, though, that it makes up a lot of Apple's price difference. Still, I'd feel weird buying one, because you'd spend more on the monitor than on the computer, and that's just backwards.

I do think it was a bad decision to drop Nvidia right now, seeing how a GTX 460's size and power requirements should fit the iMac, and it's much faster than a 5770. iMac customers would have been better off if Apple hadn't dropped Nvidia.
 
I think Apple should produce AMD-CPU Macs; Intel needs competition. But I recently found software that runs only on Nvidia cards, so I hope Nvidia produces a Fermi version for the Mac. More competition, not less.
 
Those interested in the 27-inch display of the iMac could wait for the coming Cinema version (coming in September), or they could get the same panel with a matte screen in the Dell U2711, at least according to http://www.flatpanelshd.com/review.php?subaction=showfull&id=1265617565. It's probably cheaper than what Apple will be charging, and it supports VGA, DVI, HDMI and DisplayPort input (the Cinema supports only Mini DisplayPort), but then again the design is different as well.

The Dell U2711 can be found at Amazon for $939.00 + $20.49 shipping.

As for CUDA, I think Apple has a bigger interest in getting OpenCL adopted as the common standard (after all, they started the work on it), if nothing else because it allows them to keep shifting between graphics suppliers. In the end it will be good for us consumers, since surely neither ATI/AMD nor Nvidia wants to fall behind on OpenCL performance if that's what's required.

As for why the desktop line didn't get the GeForce 320M like the mobile versions did, I suspect Nvidia did something wrong and this is punishment, like the incident mentioned in my earlier post that gave Nvidia the chance in the first place.
 
[citation][nom]TommySch[/nom]Since when is Apple a significant market???[/citation]
Since when was Apple a significant part of the video production market? Was that your question?
 
Apple switches to lower-priced parts but charges the same price. Then it will switch back to higher-priced parts, claim improved performance, and jack up the price. And Apple users will swallow it, hook, line, and sinker.
 
OK, so what's the big deal here? The previous generation had a baseline model that used a Core 2 and thus could use the Nvidia 9400 IGP chipset. The only reason there's no Nvidia option anymore is simply that Core 2s are no longer a processor option; they're all i5s or greater. Talk about making a big deal over nothing.
 
[citation][nom]Antilycus[/nom]The people who truly know GPUs know that Nvidia is the leader over ATI/AMD any day of the week. ATI's drivers have always blown. All they do is take the engineering that Nvidia creates and make it bigger: more RAM, more cooling, more pipelines, etc. This would actually stop me from buying a Power Mac, as the last one I had was the liquid-cooled 2.5 GHz quad. Oh well, maybe the price will lower on a Quadro card now.[/citation]

Nvidia sucks. That's why Nvidia's stock has dropped 45% this year. I guess you didn't get the memo...
 
Good riddance; I truly hate those computers anyway.

+1. ATI hardware is good, but the software sucks...

I hate the fact that the drivers are tied to the Microsoft .NET Framework, and that it tells you at the end of the installation that it failed...

Puke... sorry, I just saw the yucky iMac again!!

That's where Nvidia is cool: hassle-free installation.
 
Actually this is pretty bad news.

The 320M is the badass of integrated graphics, generations ahead of anything ATI or Intel has on the roadmap. But it will be a beast of another age, relegated to 13-inch MacBooks.

I would rather see more Core 2 Duo parts with this amazing IGP than another x86 machine running Mac OS X on common PC hardware.

BTW, only idiots use iMacs and Mac Pros. Mac OS X simply does not deserve this privilege any more.
 
[citation][nom]Zagen30[/nom]"The ATI Radeon 5770...[is] faster than the top-of-the-line graphics cards in the previous generation".If they're saying that the 5770 is faster than the 4890, then they're dead wrong and lying to their customers. I looked back at some of the performance reviews on this site of the 5770 and it was consistently outperformed by the 4890.[/citation]

It's referring to the last generation of Mac Pros, not the ATI 48xx generation. Although even then the fastest card was the 4870, which is still more powerful than the 5770, except if the drivers are in much better shape for the new offering.
 