undocumented problem ....
😱
Just curious, is that your justification for everything, that it's OK to take a big dump on AMD? "Oh, AMD sucks so bad; if anything actually tried to work, it's broken, that's why it was done this way. AMD CPUs are broken. AMD GPUs are broken. Anything AMD has is broken." ... AMD would be bankrupt from all the lawsuits against them if that were the case.
Why do you think people were thinking of doing Nvidia + AMD cards in the first place? the people actually doing it weren't idiots who didn't have a clue.
It's not broken. It's business: business to screw over the competition, i.e. AMD.
The thing is, if AMD even tried some of the stuff that has been pulled on them, they would be sued so fast their head would spin.
http://www.techradar.com/news/computing-components/intel-sues-amd-over-breach-of-agreement-585553?src=rss
AMD is under a microscope all the time, while Intel sits on Mount Olympus where no one is allowed to look down on them. It is encouraged to take a leak on AMD. This mentality is complete BS. Too many people think this way.
Actually, AMD's GPUs from the 4000 series and up have been great. The HD 2K/3K were meh, although I liked my HD 2900 Pro. The HD 7K is wayyyy too overpriced, honestly, but then again I got used to $300 high-end GPUs.
As for the screw-over, nVidia screwed everyone over in many ways. The SLI-only-on-their-chipset crap during the Core 2 Quad/Phenom era was pretty bad. Then they got peeved when they didn't get a new license for Intel's chipsets, but I understand why not. Intel, and AMD, are moving everything onto the CPU. There is no chipset for PCIe anymore; the southbridge is mainly SATA/USB, as PCIe is on the CPU die. So it would have been a pointless license in the end.
As for the PhysX crap, I can understand why nVidia did it, as they cannot guarantee it works properly with every possible configuration of an AMD GPU (be it a single Radeon or CFX) plus one nVidia GPU. It's much like how PC game devs cannot guarantee that on release any person's PC will run a game without issues, since it's impossible to test every hardware configuration out there.
Still, PhysX is overrated. I like the idea, but it's nothing essential, as it doesn't add to gameplay, which I find much more important.
looks like intel (like nvidia) gets deep into game development with other companies. amd doesn't do that, usually.
i noticed that in this specific case, starcraft uses havok, which is owned by intel. it's kind of a no-brainer that intel would push software optimization for their own hardware. say intel puts the compiler fix into havok but leaves it turned off. i don't think they're really breaking the law or explicitly sabotaging amd. per the court order, intel put the compiler fix in the software (turned off by default) and leaves it as is, since havok belongs to intel.
i've noticed that it almost always turns to games, but there is other software people use.
Intel leaves it off as they, again, cannot guarantee how it will work or whether it will even benefit AMD CPUs at all. They do have a disclaimer on their website.
Stalker may be optimized for AMD, but was it crippled for Intel systems in the same way SC2 was crippled for AMD?
http://downloadsquad.switched.com/2010/01/04/intel-forced-to-provide-a-compiler-that-isnt-crippled-for-amd-processors/
What software vendor in any industry would cripple their software for 80% of the market? When you make the program run equally on both sets of hardware, guess what? Intel is only ~10% faster.
But no one wants to see that; they want to see the massive differences that SC2 shows. Ultimately that's Intel's agenda: ignore all the programs that run similarly, and only look at software Intel pushed, either with their compiler or with the Havok engine.
Stalker and SCII, again as I said before, are two completely different game types. RTS performance cannot be compared to an FPS game's performance. An FPS does not have nearly as much AI going on as an RTS, which uses CPU power, not GPU power. That's why the older CPU tests in 3DMark had a ton of robots running around rather than intensive graphics.
and yet the fastest amd cpu is 28% slower than the slowest intel system in that test, and 40% at the top end, with both cpus clocked lower.
It is possible. The Athlon 64 was quite a bit faster than the Pentium 4 at lower clock speeds. So why couldn't Intel have the same lead in performance, considering that in the same time frame Intel has introduced six new CPUs, all with performance gains over the previous one, while AMD has introduced three (four if you count Thuban, which was just a six-core Deneb), some with no performance gains (or very little, e.g. Athlon 64 -> Phenom)?
So it would seem to me that it is within the realm of possibility that Intel could have a decent performance lead on AMD.
bit tech ... rofl.
Note: the AMD chips were tested in an ATX motherboard, while the Intel LGA1155 chips were tested in a micro-ATX board. This difference can account for up to 20W.
and their overclock ... rofl ... let's see how far we can push power draw: CRANK THE VOLTAGES, ALL OF THEM. Let's burn this thing up.
AMD's CPUs have almost always used more voltage than Intel's. But still, the Intel CPU was clocked higher than the FX and was probably pushing around 1.40-1.45V, which is pretty high for Intel's 32nm process.
Still, most sites noted the same inefficient design in the FX series when overclocking.
I think you guys are not understanding each others point and are arguing your own points over and over again, lol.
Intel side: AMD does poorly -> benchmarks to prove it -> fact put forward and accepted (at least by me, lol).
AMD side: Intel does not give devs/OEMs room to make AMD look better and cripples them in very dubious ways, distorting fair-and-square competition -> shows several pieces of FTC evidence and explains why the benchmarks put AMD behind by a wide margin -> not accepted (or is it accepted, but still pushing the other argument?).
That makes this thread go around in circles until a new tidbit of information about Trinity or PD comes around and we start over again, lol.
Anyway, I still consider BD to be a very expensive sidegrade for me, but I still recognize the boldness of the new design. I still think AMD screwed up trusting Intel on FMA4 and not PUSHING early development of software that actually supported BD out the door. Hell, not even their own bloody compiler fully supported BD at launch (not optimized as it should have been). That's just dumb.
Cheers!
It does seem to go around and around, doesn't it?
And it is bold. But bold doesn't always mean better. NetBurst was a bold change from the Coppermine Pentium III, and much like Bulldozer, it failed to impress until a few generations down the line. Even then it was outperformed by the Athlon 64, which used less power and performed better at a lower clock speed.
Of course, it seems that's treated as impossible for Intel; only AMD can do it.
I say BD is fine; it's just nothing amazing to get super hyped about. PD may be, or it may not be. Hard to say. Still, it won't be up against IB very long. It will meet Haswell, and if Intel can get their 22nm in line by then (I am sure they will) and the changes they make are good enough, PD may just be another Phenom II: it catches up only to get left behind again.