CUDA, PhysX Are Doomed Says AMD's Roy Taylor


Of course he is. However, I was not the one stating that "Roy Taylor simply stated facts". Opinions are not facts. You also have to realize that he was interviewed as an AMD representative. I'm sure his "opinions" were a lot different when he worked for nVidia, 3 or so years ago.
 




What? Like the ones in this interview?

http://www.eurogamer.net/articles/nvidias-roy-taylor-interview
 

Thank you mousemonkey. One does not even need to read very far into that interview before the fiddle plays a different tune...like a good politician. Morals and ethics...what morals and ethics? It's OK, he just roots for whoever pays the bills.
 

Sadly, calling someone a fanboy just shows that you have failed to make any additional arguments and have resorted to ad hominem, since you have absolutely no reasoning or evidence to show otherwise. Additionally, in my personal experience, people who end sentences with "LOL" are in actual fact never laughing out loud; they're usually doing the exact opposite.

Attaching one's ego or identity to a particular product brand, political concept, or cultural ideal is just silly. It amounts to saying: I root for/practice/believe in "X", so I'll be magically validated and feel better about myself and my actions by legitimizing "X".
 
^^ Intel is another big one. I think AMD is generally more "consumer friendly", at least for a major company.

Let's not forget EA and their crappy DRM either.

Greedy, immoral business practices actually lose more customers than they gain.
 
And Apple, Foxconn, Google (to a lesser extent, but 'don't be evil' is pretty well shot). That's before you even get outside the tech world and start talking about Monsanto, various pharmaceuticals, military contractors, banks, lobbyists, etc.

Then there's this, which really irritates me, because it assumes you're up to no good if you want to do something also done by hackers:
The Media Access Control (MAC) address is hard-coded on Intel wireless adapters and cannot be changed.

Some third-party software applications can "spoof" a MAC address to a different address, but for security reasons, Intel does not support this practice.

Beginning with 12.x wireless driver package, the possibility of "spoofing" the MAC address was blocked to prevent this practice.

Let's ban kitchen knives because terrorists use them to hijack planes. Real patriots buy microwaveable meals.

[/rant]
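For what it's worth, the "spoofing" Intel is so worried about is trivial at the OS level anyway. A rough sketch of how it's typically done on Linux (plain Python wrapped around the standard `ip` tool; the interface name and MAC below are made-up examples, and this has nothing to do with the Windows driver lock-down Intel describes):

[code]
import subprocess

INTERFACE = "wlan0"            # example interface name; use your own
NEW_MAC = "02:12:34:56:78:9a"  # example locally administered MAC

# Equivalent to running, as root:
#   ip link set dev wlan0 down
#   ip link set dev wlan0 address 02:12:34:56:78:9a
#   ip link set dev wlan0 up
for args in (["down"], ["address", NEW_MAC], ["up"]):
    subprocess.run(["ip", "link", "set", "dev", INTERFACE] + args, check=True)
[/code]

Hardly master-criminal stuff, which is the whole point.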
 
This is just the opinion of a single AMD employee.

PhysX is great and always will be. I hope Nvidia will just ignore these petty attacks and take things to the next level by integrating PhysX support into even more games.
 
Nvidia has provided me with years of breathtaking gameplay because of its physics effects. I play action games, FPS, etc. The little touches, like walls crumbling when you shoot and cloth being punctured as it sways in the breeze, are absolutely worth buying Nvidia for.

Besides that, I am a CADD/CG designer, and CUDA technology saves me money every single day. Photo-realistic renderings and videos are immensely faster with CUDA. My all-time favorite rendering software, Blender, will only use CUDA GPUs. ATI cards leave the user no choice but to compute renders WITH THE CPU. CUDA cut one rendering's time from over a day to a little over an hour. Even Autodesk products give me rendering issues on my ATI machines.

Game developers prefer Nvidia for these reasons. That's why the game experience is better with Nvidia. OpenCL is great for Linux applications, but I still have to use CUDA in Linux to render anything without pulling my hair out.

ATI is a great price and it plays games fast as hell, although sometimes with details missing.
I love the idea of the APUs and I will buy them. However, for my architectural-viz business, Nvidia is saving me loads of time and making me money.
 

Do keep in mind that all of those effects could easily be done on the CPU, or with something like OpenCL, by a decently written game engine, instead of developers being PAID by Nvidia to incorporate their proprietary junk (rough sketch below).

I will admit that Nvidia currently has the best bang for the buck with the 770/760, and I will generally recommend those in that price range, but I don't give things like PhysX any consideration.
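Just to put a little weight behind the "do it on the CPU" point: the integration step behind debris or cloth effects is genuinely simple code. A toy sketch in plain Python (made-up values, single-threaded, nowhere near a real PhysX replacement, just the shape of the thing):

[code]
# Toy CPU particle step: semi-implicit Euler, the building block of debris/cloth effects.
GRAVITY = (0.0, -9.81, 0.0)
DT = 1.0 / 60.0  # one frame at 60 fps

def step(particles):
    """particles: list of dicts with 'pos' and 'vel' 3-tuples."""
    for p in particles:
        vel = tuple(v + g * DT for v, g in zip(p["vel"], GRAVITY))
        pos = tuple(x + v * DT for x, v in zip(p["pos"], vel))
        if pos[1] < 0.0:  # crude ground bounce with damping
            pos = (pos[0], 0.0, pos[2])
            vel = (0.5 * vel[0], -0.5 * vel[1], 0.5 * vel[2])
        p["pos"], p["vel"] = pos, vel

# A few chunks of rubble knocked loose from a wall.
rubble = [{"pos": (0.0, 2.0, 0.0), "vel": (float(i), 3.0, 0.0)} for i in range(3)]
for _ in range(120):  # simulate two seconds
    step(rubble)
print(rubble[0]["pos"])
[/code]

A real engine would vectorize or thread this (or hand it to OpenCL), but none of it needs a vendor-locked library.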

 
Sounds like a guy who would rather be an Nvidia sales rep to me.... ". . . our competitor who's obsessed with releasing. . ."

They manage to make a better product and release other products at the same time, unlike AMD, which releases inferior CPUs and GPUs. This guy sounds like he would rather be selling Intel or Nvidia.

*Borderlands 2 looks amazing with PhysX.... just sayin'
 


F@H is running OpenCL now and I'm not seeing this big advantage that GCN is supposed to have. Can you explain why?
 
[Charts: OpenCL F@H Explicit Solvent FP64 and Implicit Solvent FP64 results]

Yeah, so I'm sort of picking and choosing benchmarks. But in double-precision tasks GCN generally destroys Kepler, with the occasional exception of the Titan (or the last-gen 580). In single precision it's still ahead:
[Charts: OpenCL F@H Explicit Solvent FP32 and Implicit Solvent FP32 results]
 


Those canned benchmarks don't seem to reflect real-world performance, IMHO, which is why I was calling for a different approach over in the folding thread a little while ago.

I'd like to see some numbers from real folders doing actual WUs. Anybody out there folding with 680/780s and 7970s?
 
I finally signed up for an account just to respond to this. I expect this sort of trash talk from a pasty-faced dungeon dweller living in mommy's basement--possessed of all the social acumen that entails. What I don't expect is for professionals to act this way. George Bush got up and declared the war over... and we see how that turned out. AMD is just trying to cash in on the weakness Nvidia has shown; whether that's good or bad, it still feels like an immature gesture--like a monkey shaking his flaming ass cheeks at the opposition. And it's just about as attractive, too.

As for the claim that PhysX is unfair to AMD users, uhm.... if you buy a Ford, you don't get to tell Chevy drivers what their manufacturer can do for its customers. If you want PhysX, buy Nvidia. If you want AMD, buy that. Take this fanboy BS somewhere else. It has no place among grown men.
 


And now that you've got that off your chest, welcome to Tom's! :lol:
 