Nvidia GeForce GTX 650 And 660 Review: Kepler At $110 And $230



I hope that's a joke. You didn't just suggest a Ferrari or Lamborghini is for 65 mph, did you? No matter; I think you missed the point over semantics if so.
 


My point was that factory overclocked cars aren't so overclocked that they run several times faster than the reference models and have to deal with greatly increased stress as a result. It's not a joke, and I didn't mean to imply those cars are built for ~65 MPH. It's not just semantics; this is being more realistic and not exaggerating the issue.
 

I was gonna suggest leaving your room door open (even though you may have some privacy concerns), but it seems like you already do. It's a little peculiar, since it doesn't get bothersome when I close the door to our room while playing, and my brother doesn't seem to have a problem with it (taking into account that we have two HD 7850's running, though our CPU is just an i3-2120). The ceiling fan is on often though, so it doesn't get stuffy, and we don't really let too much sunlight in through the blinds, just enough for comfortable monitor viewing.

This may sound stupid of me, but there is an air-con vent in your room, right? Maybe some cleaning is in order (the filter, for example), though you might already clean it regularly. Maybe having a cool drink while gaming would be a good idea, though that may involve having to replenish it pretty often, not to mention having to "empty" pretty often as well. :lol: Opening a window, in AZ, at this time of year... out of the question. Haha! But come the cooler months (and they're coming pretty soon 😉), it would be a good idea, since it gets pretty cold...

Downclocking and undervolting your CPU and GPU (if possible) are good things to do, even if heat isn't an issue I think... :)

 
28 FPS is unplayable? lol, I thought somewhere around 15 FPS and lower is unplayable.
 


OK, I let it go when you said it to me, but misrepresenting the facts repeatedly is now grating on me a bit :) You're now lumping all AA in with MSAA, both of which are quite easily refuted. You even went so far in your comment to me as to include the entire 600 series. So I feel free to compare them all; if your claim were true, I shouldn't be able to show any AA working fine. Please accept apologies for the coming wall, people, but it's kind of hard to compare this stuff without quoting the heck out of people etc... It must be done :)

http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/6
Pages 6/7 are the pertinent ones for the 660 Ti vs. 7950 Boost. Note pretty much NOTHING was playable at 2560x1600, which is NOT what these cards are designed for (it takes a 30in monitor to have this res, and even then many are 2560x1440 at Newegg). Note ALL 24in & below at Newegg (68 monitors!) are 1920x1200 or less, with the majority being 1920x1080.
You said MSAA tanks on NV (you said all 600-series cards do; we'll get to the 680 in a minute). All of these tests were done at 4x MSAA / 16x AF.
Max Payne 3: 14 vs. 14 minimums... NOT PLAYABLE either way, so not really much point in discussing it. But a tie.
Dropped to 2x MSAA: still unplayable on both, as they note, but look at that... 22 min for the 660 Ti vs. 21 min for the 7950B. A loss? What happened to the tanking 600 series?
Dropped to 1080p and raised to 8x MSAA + FXAA + 16x AF... This should make your case, correct? I mean, those 600-series cards should suck here.
15 min for the 660 Ti vs... wait for it... 12 min for the 7950B... I thought you said they tank? Even the max is 26.9 vs. 23.9. A total win at 8x for the 660 Ti.
"The HD 7950 has 240GB/sec of bandwidth on a 384-bit memory bus while the GALAXY 660 Ti has 144GB/sec of bandwidth on a 192-bit memory bus, yet the GALAXY GTX 660 Ti is faster at 8X MSAA at 1080p here! What does this mean? That memory bandwidth and the width of the memory bus is not everything when it comes to gaming performance, other factors are involved.
Important FACT: Neither video card was playable in this game at 1080p with 8X MSAA. The lower width and lower memory bandwidth card was faster than the higher memory width and memory bandwidth card. "
Not playable here either... We've already seen that memory isn't the issue, and I don't expect that to change dropping to 4x, which will relieve the 660 Ti more and make this worse for the 7950B.
Kind of refutes your claim totally...
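For what it's worth, the "66%" bandwidth gap the review keeps quoting is simple arithmetic; here's a quick sketch (the bandwidth numbers are from the HardOCP quote above):

```python
# Memory-bandwidth gap between the two cards, per the quoted HardOCP figures.
hd7950_bw = 240.0    # GB/s on a 384-bit bus
gtx660ti_bw = 144.0  # GB/s on a 192-bit bus

advantage = (hd7950_bw / gtx660ti_bw - 1) * 100
print(f"7950 bandwidth advantage: {advantage:.0f}%")  # ~67%, i.e. the "66%" in the quote
```

Point being: the paper-spec advantage is real, it just doesn't show up in the actual gameplay numbers above.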

Moving along...
Battlefield 3 same story @ 2560x1600 4xMSAA etc...NOT playable on either side. 21 vs. 23.
"Once again, the HD 7950 has 66% more memory bandwidth, yet here you go, the same performance in actual gaming between the video cards."
You seeing the pattern yet?
Dropping down to 1920x1080 and 4x MSAA + FXAA + 16x AF again:
31 vs. 34 (finally something playable), but they're quoting the max, which is 54.6 vs. 51.8, and state:
"Oddly enough, the tables turn at this lower resolution again and the GALAXY GTX 660 Ti GC video card is 5.4% faster than the Radeon HD 7950 w/Boost. The lesser memory bandwidth and memory width card is faster than the higher memory bandwidth and memory width card. "
Hmm... Either way, min or max, this is a wash, no? I claim min is more important, but whatever... It's a wash.
http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/7
Moving along... Batman: AC, same story @ 2560x1600 8x MSAA etc... NOT playable on either side.
What the heck? 16 vs. 14 mins... a 7950 LOSS again (though again useless, not playable... but you get the point: it loses, and no tanking for the 660!)
They also mention the weakness of the 7950 tessellation as it "suffers".
Dropping to 4x MSAA etc., same story: 17 vs. 17... TIED, but the 660 Ti manages a victory on average, 36.2 vs. 34.9, and look how long the Radeon stays low in the tessellation again... LOTS of time at lower fps... lots of time below 25 fps! OUCH.
"Once again, lesser width and lower bandwidth than the Radeon HD 7950, but yet higher game performance. "

You have to be seeing a pattern here correct? You are misrepresenting the facts no?
Dropping down to 1920x1080 and 8x MSAA + FXAA + 16x AF again:
28 vs. 22 min... HOLY BATMAN... umm... the 7950B TANKED. And they comment on it staying low a lot on the Radeon:
"Important FACT: Both video cards are playable at this setting, but the HD 7950 does lag a bit during tessellated scenes."
Surely you see the pattern now right?
Skyrim FINALLY a decisive blow for 7950B:
"At 2560x1600 with 8X MSAA the HD 7950 w/Boost is 18.7% faster than the GALAXY GTX 660 Ti GC. This is the largest difference we have seen yet."
But I wouldn't play there. This is still riding the line of unplayability on either card. You will dip below 30 on both in taxing scenes with lots going on... so down we go, to 1080p & 4x.
Dropping down to 1920x1080 and 4x MSAA + FXAA + 16x AF again:
"The Radeon HD 7950 has a 384-bit memory bus with 240GB/sec of memory bandwidth, while the GALAXY GTX 660 Ti GC has a 192-bit bus with 144GB/sec of memory bandwidth, yet we sometimes see the GALAXY GTX 660 Ti GC video card performing faster, but the 660 Ti does have a GPU clock advantage. We don't see that 66% advantage in memory bandwidth the Radeon HD 7950 has in real-world gaming with high setting AA configurations. This means there are other factors besides the width of the memory bus and the bandwidth that affect performance between these two cards; namely GPU clock and architecture advantages."
The important takeaway here... Again, your memory issue, and tanking MSAA (you said all AA to the other poster, lumping it all in... LOL), is completely false.
So at a res/settings where we're finally playable, and where your 7950B finally does something good, both cards are plenty playable :) Never dropping below 50fps.
This sums it up best on the next page:
"In the end this all reinforces our stance that memory bandwidth isn't everything, and people seeking out video cards for gaming should not focus so intently on the memory bus width and bandwidth specification for determining their video card for gaming. Other factors go into it, and only through actually gameplay will you know how these truly perform side-by-side. That is why we here at [H]ardOCP actually play games with these video cards and use that real-world gaming performance to determine which card your money is better spent on.

Benchmarks and the like cannot tell you this real-world information, and can be extremely misleading. Take for example a test that stresses memory bandwidth and fills it to the brim, sure, the 7950 would win that test, but that test doesn't translate to what we just found out in real-world gaming. Performances were close between these cards, and in some cases faster on the card with the lesser bus and bandwidth. So focus on what real-world gaming tells you.

Our testing today at super high AA settings has shown that the "bus limited" GeForce GTX 660 Ti does not crumble so easily. There were no instances where we ran into "VRAM wall"s that took gaming performance into single digits, this never happened even at high AA settings with Transparency AA."

Also note all the 660 Ti's at $300 come with Borderlands 2! :) Also note the power difference between the 7950B and the 660 Ti.
Note the temps and noise too at Anandtech (HardOCP wasn't testing that here, just super-high AA stuff):
http://www.anandtech.com/show/6159/the-geforce-gtx-660-ti-review/19
315W for the Zotac vs. 373W for the 7950B! That won't help my hot room. This should be expected at 1.25V for the 7950 vs. 0.9725V for the 660 Ti... you can't get around heat with that much core voltage.
You can have the boost; it's running 9C hotter (68 vs. 77, and that's the highest-clocked Zotac, so you're not going to look too good here in my room).
If you remember, I'm basing my buying decision on temps/noise since I live in AZ and already commented on the hot-room issues. But the MSAA claim was just proven incorrect anyway, so I'm still correct in my decision with or without the hot room being the deciding factor.
Noise? Same page at Anand: 49.2 vs. 58! That's nearly 9 dB, and the dB scale is logarithmic, so that really sucks. The Zotac is the loudest 660 Ti in the review, so this is not good for the 7950B.
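A quick sketch of why a ~9 dB gap is worse than it sounds; the dB figures are from the Anandtech page above, and the conversion is the standard decibel-to-power-ratio formula (each 10 dB is a 10x increase in sound power):

```python
# Noise figures as quoted from Anandtech's 660 Ti review page.
zotac_db = 49.2    # Zotac GTX 660 Ti (loudest 660 Ti in that review)
hd7950b_db = 58.0  # Radeon HD 7950 w/Boost

gap_db = hd7950b_db - zotac_db
power_ratio = 10 ** (gap_db / 10)  # dB is logarithmic in sound power
print(f"{gap_db:.1f} dB gap = roughly {power_ratio:.1f}x the sound power")
```

So "9 more dB" is not a 15% difference; it's several times the acoustic power, which is why it jumps out in a quiet room.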

You say the GTX 680 sucks at MSAA too? Heck, you said all of them, so I could throw the 690 in here too...
http://hardocp.com/article/2012/08/29/gigabyte_gtx_680_super_overclocked_edition_review/
"We will be testing it against the GeForce GTX 680, Radeon HD 7970 GHz Edition, and the highly overclocked MSI GTX 680 Lightning."
Note the temps/watts: the 7970 GHz Edition loses to all the other cards. It's the worst. And it goes without saying that the hotter you run, the noisier you are.
Battlefield Multiplayer comment "The AMD Radeon HD 7970 GHz Edition was only playable after lowering the resolution to 1920x1200 with 2X MSAA, FXAA High, and HBAO enabled, and averaged 63 FPS."
Slower, and NV had PhysX turned up to max! They had to TURN DOWN MSAA on the Radeon, without PhysX!
Batman AC? "The NVIDIA GeForce GTX 680 averaged 47.6 FPS with these graphics options. The AMD Radeon HD 7970 GHz Edition also averaged 47.6 FPS but with no level of PhysX enabled." And of the OC'd 680s: "The NVIDIA video cards took the cake in this game, proving playable at 2560x1600 with FXAA High and PhysX High enabled."
Decimated, as all the NV cards, even the reference one, had PhysX on (the reference 680 basically tied the 7970 GHz; the others wiped the floor with it).

Max Payne 3? Again, that was vs. the reference 680, while the others blew the 7970 GHz away.
Witcher 2? "The AMD Radeon HD 7970 averaged 41.9 FPS using these settings and the NVIDIA GeForce GTX 680 averaged 42.2 FPS." Another loss, even vs. the reference 680, and beaten by the OC cards of course. So in all cases, even vs. reference, at best you get a wash with no PhysX, and at worst you just lose and maybe even have to turn down MSAA. Kind of refutes your words.

You can read the rest yourself.
Does it help if you have two 7970 GHz cards vs. two 680s in SLI? Does anything change with two of each?
http://hardocp.com/article/2012/05/29/galaxy_geforce_gtx_680_gc_sli_video_card_review/
"These cards are a beast in SLI, providing us the best performance possible at 5760x1200. There is no question these also beat Radeon HD 7970 CrossFireX to the punch at every turn. There is even more room to overclock as we found out, getting in that realm of upper 1.2GHz operation in games. "
And more; important for me if I were actually rich... ROFL:
"These are sleek, run cool, and are the fastest thing on the planet right now."
Again, even in SLI/CFX, it's a no-brainer on the NV side.
Not enough? A Sapphire 7970 @ 1280MHz! vs. the MSI GTX 680 Lightning:
http://hardocp.com/article/2012/07/30/msi_geforce_gtx_680_lightning_overclocking_redux/5
Max Payne 3: "GeForce GTX 680 Lightning had the clear performance advantage over the Overclocked AMD Radeon HD 7970"
BF3 single-player (who plays single-player?) = wash: "The extra performance compared to the Overclocked AMD Radeon HD 7970 did not make much of a difference in game."
Batman = OUCH, AMD decimated by the 680: "This is 16.2% faster than the stock MSI GeForce GTX 680 Lightning, and 19.3% faster than the Overclocked AMD Radeon HD 7970. The performance increase was consistent during the entire run-through. It provided a significantly better gameplay experience with smoother framerates during fighting and while climbing buildings."
Witcher 2 = wash (21 vs. 22, or 51.5 vs. 50.3; a wash, as stated)
BF3 multiplayer, 4x MSAA again = OUCH: 59 fps for the 680 and only 50 fps for the overclocked 7970.
So let me get this straight... We crank things up in multiplayer with 4x MSAA, and the 7970, even at 1280MHz, gets killed? What happened to the tanking 600 series? It looks better for the 680 all around here, or a wash/tie. Don't forget PhysX either.
"we strongly feel this video card deserves our HardOCP Enthusiast Gold Award." And an even worse statement for AMD: "The video card also one of the fastest out-of-box operating speeds. It even went head to head with one of our fastest overclocked AMD Radeon HD 7970's and swept the floor with it."

Sorry, no love for Radeon this time either. "Swept the floor with it"... that's a pretty strong statement about a series of cards you say you wouldn't touch. I feel HardOCP is correct in using actual gameplay, as it's true gaming rather than just a dumb benchmark that isn't reality. "and provided noticeably better performance in game." As noted above, actual gameplay is what they key on. I'd rather have them key on the minimums while doing it, but either way the story comes out the same: MSAA doesn't tank on ANY of the 600 series. Not sure how Tom's got it to work that way, as there's a body of evidence against you.
Is Anandtech different? Here's a 1920x1200 score rundown for you (because I had fun taking apart Ryan's 660 Ti Zotac review last month... ROFL; check out the comments section, pages 27/28/29 if memory serves, where I destroyed him and his lackey Jarred Walton politely until they ended up calling me an Ahole and uninformed... ROFL. No argument with my data though; I was rather shocked they resorted to personal attacks... LOL).
From the Anandtech 660 Ti review, check out the 4x MSAA / 16x AF Skyrim results. Note all the 660 Ti's (including reference) beating the 7870/7970 & 7950 Boost at 1920x1200. No MSAA drop there.
It got a little nasty, so understand this is NOT aimed at you, BLAZORTHON... ROFL. It's from my response to Ryan's response to me picking apart his OWN WORDS on page 27 of the comments section (before Jarred got personal about it... hehe. I'm on there as TheJIAN; it's my handle most places. I had issues registering that here, I think because of an ancient account from the Tom Pabst days blocking me, LOL):
"while the 7950 is anywhere between a bit faster to a bit slower depending on what benchmarks you favor." HERE WE GO AGAIN, PEOPLE; follow along:
Civ5 <5% slower
Skyrim >7% faster
Battlefield3 >25% faster (above 40% or so in FXAA High)
Portal 2 >54% faster (same in 2560x...even though it's useless IMHO)
Batman Arkham >6% faster
Shogun 2 >25% faster
Dirt3 >6% faster
Metro 2033 = WASH (Zotac 51.5 vs. 7950 51... margin of error... LOL)
Crysis Warhead = WASH (the ref 7950 (66.9) lost to the ref 660 Ti (67.1), and the 7950B's 73.1 vs. 72.5/70.9/70.2 fps for the other three 660 Ti's)... a WASH either way.
STARCRAFT 2: 7950 (88.2 fps) VS. GTX 670 (121.2 fps) @ 1920x1200
So another roughly 37% victory for the 660 Ti, extrapolated? I'll give you 5 frames for the Boost version... which still makes it ~30% faster in Starcraft 2.
So vs. the 7950B which you MADE UP YOUR MIND ON, here's your quote:
"If we had to pick something, on a pure performance-per-dollar basis the 7950 looks good both now and in the future"

We have victories of 25% (BF3), 54% (P2), 7% (Skyrim), 25% (Shog2), 30%+ (SC2), and 6% (Dirt3),
one loss at <5% in Civ5, and the rest washes (less than 3%)... But YOU think people should buy the card that gets its butt kicked or is a straight-up wash. You said your recommendation was based on 1920x1200... NOT 2560x1600... Well, suck it up; this is the truth here in your OWN benchmarks... yet you've ignored them and LIED. Let me quote you from ABOVE again, lest you MISSED your own words:
"And 1920x1200 is what we based our concluding recommendations on. "
Then explain to me how you can have a card that wins ONE game at <5%, is BEATEN in 6 games by OVER 6% (4 of those 6 by >25%), and yet still come up with this ridiculous statement (again, your conclusion):
"On the other hand due to the constant flip-flopping of the GTX 660 Ti and 7950 on our benchmarks there is no sure-fire recommendation to hand down there. If we had to pick something, on a pure performance-per-dollar basis the 7950 looks good both now and in the future"
Are you smoking crack or just BIASED? Being paid by AMD? Well? The 7950 is NOT cheaper than the 660TI. Can you explain your math sir? I'm confused. What "flip-flopping of the GTX 660 Ti and 7950"?? You're making these statements on 1920x1200 right? OR should I quote you again??...LOL."
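For anyone who wants to check my percentages, they're just relative fps differences; here's a minimal sketch of the arithmetic (the fps values are the ones quoted above; note the SC2 figure uses the GTX 670 result, which I then extrapolated down for the 660 Ti):

```python
# How much faster card A is than card B, in percent, from average fps.
def pct_faster(a_fps: float, b_fps: float) -> float:
    return (a_fps / b_fps - 1) * 100

# Starcraft 2 @ 1920x1200, GTX 670 vs. HD 7950, numbers quoted above.
sc2 = pct_faster(121.2, 88.2)
print(f"SC2: GTX 670 is {sc2:.0f}% faster than the 7950")  # ~37%, as stated
```

Same formula behind every percentage in the list; plug in whichever pair of averages you want to verify.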
It's really funny when read in whole. Ryan tried to defend his position of testing at 2560x1600 and complaining about memory (kind of like your comments), then said people are buying IPS monitors so this is important, at $400 from Korea... LOL. I had a field day with eBaying monitors from Korea... I wouldn't give my credit card to some dude in Korea with a Gmail address... LOL. They really should have just shut up, but it's quite humorous to watch their arguments (then personal attacks) melt down. The best part is I used their own article for 95% of my comments. It was ALL their own data... :sol: They didn't respond again once I called Jarred out on attacking me while saying NOTHING about the #'s.

Radeons are great cards (heck, both companies make stuff anyone would be happy with), but you're incorrect on AA and MSAA, as shown at both sites whenever MSAA was used. Also, rather than waste your time, I'll tell you I believe that if I looked hard enough, I could find a few results the other way. But that doesn't negate the PhysX, temps, or noise, which still leads to my wanting a 660 Ti unless AMD really makes me a sweet deal on Black Friday (real sweet, but I could go that way... just not likely, due to heat).

Another one here showing Batman dipping to 10 fps while the 660 Ti holds 28 fps minimums:
http://hardocp.com/article/2012/08/21/galaxy_geforce_gtx_660_ti_gc_3gb_overclocking_review/4
They jumped all around, as they usually try to show the highest settings you can get away with, but no MSAA drops here either.
As noted at Anand, the 7970 is a cherry-picked chip (the GHz Edition is too), so it should perform slightly better than normal chips (and far better than the 7950 Boost, as those are second-hand chips).
http://www.guru3d.com/article/radeon-hd-7950-overclock-guide/3
GPU power went from 138W (already above the out-of-box OC'd Zotac at Anand etc.) to 217W. That's not going to keep my room cool, and this was only 1150MHz (not near your stated 50%, as the default here is 800, so ~44%; which I'd say is pretty tough to do anyway: 40% OK, 50 rare, and only if you like heat and watts), and it took 1.25V to get there.
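A rough sketch of why the voltage hurts so much: first-order CMOS dynamic power scales with frequency times voltage squared. The stock voltage below is an assumption on my part (reference 7950s commonly run somewhere around 1.09V; the article doesn't state it), while the clocks and OC voltage are the Guru3D numbers above:

```python
# First-order CMOS dynamic power model: P ~ C * V^2 * f,
# so relative power scales as (f2/f1) * (V2/V1)^2.
stock_mhz, oc_mhz = 800.0, 1150.0  # clocks from the Guru3D overclock guide
stock_v, oc_v = 1.09, 1.25         # stock_v is an ASSUMED figure, not from the article

oc_pct = (oc_mhz / stock_mhz - 1) * 100
scale = (oc_mhz / stock_mhz) * (oc_v / stock_v) ** 2
print(f"Overclock: ~{oc_pct:.0f}%; first-order power scale: ~{scale:.2f}x")
```

Guru3D measured 138W to 217W, i.e. about 1.57x; the crude model overshoots (not all board power is switching power), but it shows why cranking the core voltage blows up the heat far faster than the clock gain alone would.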

More evidence that it's not so easy to hit your 50% overclock, even on 7970's; some have trouble hitting 1200!
http://www.behardware.com/articles/853-18/roundup-the-radeon-hd-7970s-and-7950s-from-asus-his-msi-powercolor-sapphire-and-xfx.html
Look at the charts; note the bottom one. Only TWO cards hit 1200MHz. Three of the 7970's couldn't even hit 1200! Look at the chart above that line... 1.20V is REQUIRED in most cases to hit anything over 1100; that's the FIRST voltage where they all reach 1100. At 1.174V, two couldn't even hit 1100 (the HIS and the reference card). ONLY 3 cards hit 1125 @ 1.25V, and only ONE went above. That's 11 cards, 4 of them 7970's. Not as easy as you say, and they'll get HOT/noisy. Note the reviewer's comments on pg 19:
"When the GPU voltage is changed this goes up to an increase of between 21 and 78%, which is enough to put the power stages of these cards under stress."
OUCH. That’s a lot more power draw to do what you say correct? Most had a tough time and didn’t get there. Note 7970’s are CHERRY PICKED.
Also note his page 14 comments and the charts showing heat stuck inside the pc frying other components as they don't expel the heat out the rear well:"The reference HIS, MSI Twin Frozr III and Sapphire OverClock Radeon HD 7950s and Radeon HD 7970 Lightning however tend to direct more hot air towards the hard drives.". Only two cards didn't do that. Another downer for the 7950 cards in my mind. The CPU in your case (your OC card) would be 5C higher as shown in the chart vs. reference model. That's a lot of C added to your CPU.

In closing, I apologize to the rest of you for the wall... but after looking at the data (a month ago, during the Anandtech mini-war... LOL), it's hard to let your statement fly. School's out :)
Unless a radical price change alters the story on Black Friday, I think I'd be foolish to buy anything but the 660 Ti, especially given the heat/noise and it rarely losing at 1920x1200. I don't intend on buying a 27 or 30in shortly, and if I did, I'd buy a second card so as not to sacrifice perf; as shown, 2560x1600 with MSAA etc. is mostly unplayable on both the 7950 and 660 Ti. They both hit the teens MANY times at HardOCP, which is why I tore Ryan's review apart for making statements as if people try to do that with one card. Others agreed, and most doing it had SLI.
 


Nothing you said is stupid... I welcome any AZ comments, as I haven't been here long :) Unfortunately this house is brand new; we just finished building it for a June 1st move-in (which was late, BTW... ROFL; 2600sqft single-story house in Mesa). I had the AC guy come out and try to direct air from the bathrooms (two between me and another bedroom not used for anything more than a tanning bed... LOL... I know, we didn't buy it in AZ, ya know... LOL) and that tanning bedroom. It lowered it in here about 2 degrees.

I think the biggest problem is we have no zoned heating/cooling. They actively went against us on this and said nobody does that here. Is that true? We had it in Oregon and Texas. I'm now thinking that's 90% of my problem, as I used to be able to drop, say, Zone 3 (with the PC) much lower than the entire house (2000sqft in Oregon, 2650 in TX), and it made a huge difference in both houses. I never complained about my PC then, though in Oregon it wasn't the same setup: an AMD X2 3800 back then (the smaller version that was excellent heat-wise; I was still selling them then, so I had a great one... heh) and an 8800GT, if memory serves. The Oregon house was two-story, though with zoning that should have been a non-issue, I think.

The only other thing I can think of (I know my parts are great, so it's not the PC) is the blinds, as you mention! The company had to switch us at the last minute, and we had nearly see-through crap tan ones. I objected once I saw them in the house and got them replaced with the darkest they had, but they're nowhere near what we got in TX (not part of the house bill then, so we got whatever brand we picked). In TX they were BLACKOUT blinds, and I think they blocked a ton of heat there. I seriously NOTICE the temp change if I'm playing and the sun comes up (it does on this side). Within an hour I'm like, dang, these blinds suck :) I also miss the blackout blinds for light; I have problems sleeping without DARKNESS :) I hate these friggin things. 5:45 and it's light in here... BS :)

I do think I can drop the heat to bearable just by going Ivy in a month or two, though. I'm looking forward to that and to downclocking it to 3GHz (same as my current dual, but with much higher IPC, so I'll still grin about it). I don't think the video card will offer any lower temps over my current one (despite the fact it's time to upgrade for me as a gamer... LOL). From what I can pull up on the 5850's old temps vs. today's benches of the 660 Ti, I think it's a wash, so no gain there, and I want either an old 7950 (not the Boost second-handers) or a 660 Ti. I don't need Borderlands 2 with the 660 Ti, so it's pretty much about a really good price on an old one (which it seems I may not be able to get in a month or two, and I refuse to buy before Black Friday... LOL), or a 660 Ti, just because I already see it's generally cooler and quiet (and my PC is built to be silent).

Yeah, the door is open most of the time, but that makes me lose my Klipsch v2 400's... LOL. I have to use headphones or drive others crazy. I have the old Logitech Z560's too (1000 watts of rocking gunshots... LOL), but those are just out: shotguns etc. will drive your NEIGHBORS insane on those, let alone other people IN the house... ROFL. The Klipsch are a better balance and not so booming. But given the opportunity, and nobody home, I'd rather game with the Z560's (Z540's too, for testing; I owned a PC business as stated, so I have an excuse for all the hardware crap...)... ROFL. You actually FEEL the hits... :) and your feet vibrate with the Z560's... ROFL. But electricity-wise, the Klipsch are a BASH amp (still 800 watts) and don't draw watts when just sitting there browsing etc. I like that. You can also hear a few sounds on them that don't show up on the Z560's in the same games. The Klipsch are really the best PC speakers I've heard in 15 years. You'll drive yourself out of the room at anything over 300 watts on either, though.

One comment on your setup: your CPU is much cooler than mine, unfortunately. As a reseller (and OCer), we used to run through everything that came in, and I'm talking volumes, just to get the best OCer into our own systems. I miss that :) Now I roll the dice on whether I'll get the correct week, Costa Rica/Malay, etc... I miss the trays... LOL. I sold OC'd machines, so we had data that made getting the best for ourselves easy. Not anymore, & I hate dice :) But yeah, get a higher-end CPU and I think you'll be telling me about your room temps :) I'll buy new blinds next year (we're entering winter, so I figure Black Friday CPU/vid/board over blinds for now; I already have 16GB for it). The blinds are a good $400-600 for this room, I think, so I'd rather upgrade the PC in winter and do blinds maybe around March/April, just before the heat. :) Thanks for the reply though; good tips. Unfortunately I think I've exhausted anything I can do without lowering the PC's heat somehow or getting zoned AC, which is way out there for now. Agreed on the downclock... no need for the heat, here or anywhere, unless I think something is slow. I can get my chip to 3.6GHz easily below the regular defaults of an E8400 (1.25/1.35V), so my Xeon is pretty dang good :) It's undervolted to 1.07 and runs great at 3GHz there.
Fixed a few spelling errors...I'm tired...ROFL.
 


You're kidding, right? A troll just drops in a comment and watches the fire break out. I obviously did a lot of homework, as you noted, before coming to my conclusion as shown. I'm not quite sure why you can't appreciate that. You'd rather I just make blanket statements like a lot of people, without backing anything up at all? Get over the length of the post and engage the data, or don't bother commenting about my post. I was attempting to make a buying decision to DROP the temp in my room. That requires a LOT more effort than blindly accepting one site's benches. I had much more data; that was just what was easy to pull from my Anandtech posts, where I buried Ryan/Jarred's NON-data comments. Which was totally lame on their part... when faced with logical reasons and loads of benchmarks, they chose to attack me, not the data, and went silent when called on it. I wasn't alone in pointing that out. I'm surprised Anand himself didn't tell them to stop. Maybe he did :) They instantly shut up.

There is no fanboy crap in that post. I merely laid out in great detail why blazorthon's comment was factually incorrect. I'm sorry my post doesn't lean the way you want; I go where the data points, regardless. If you'd bothered to read it, I said blaze shouldn't waste his time, as I could find stuff for the other side too (I was mostly just pointing out, in great detail, that there's no drop with MSAA). I can show either side winning on perf (though it's tougher with AMD; you do have to dig a bit this round), but temps/noise can't be won for AMD this round (not true of the 5850, which is why I bought one back then). For the next 10 days it's in the mid-90's here! TODAY, Sept 24th, it will be 100! Check weather.com. As army_ant7 said, opening a WINDOW here is OUT at this time of year. I'm nonsensical? Seriously? With that alone I know he lives here... ROFL. He is soundly correct.

I'm really disappointed someone calls you a troll when you give them tons of data proving a point. Typical democrat response: "well, uh, you've laid out a great detailed explanation... umm... you suck." It's sad when fanboy crap gets in the way of great data/discussion. I'm possibly a troll, or at the least you're unimpressed... LOL. WOW. As I told Ryan at Anandtech: the data doesn't lie (but he did, repeatedly). It's unfortunate you'll read a 15-page review but ignore someone laying out a page or two of data with analysis from various sources, with links even. Why are you here? To tell people like me we have personal issues? I came to discuss the data. You'd think you would seek out people like me, like I do (even differing viewpoints; it makes me smarter and more well-rounded). You can't get a real picture of anything off one site, and many of us together can read much more than any ONE of us. Forums are for exactly this: a place for people to meet and share data from all over, so hopefully we all get more educated.
Blazorthon was a bit too condescending to the other guy (he said the poster "needed a lesson," when the poster really didn't say much beyond his "limited experience" comment; nothing that should draw "you needed a lesson" crap), and to me, and then he went on with his factually incorrect stuff. I felt he needed a lesson himself :) If people were forced to back up comments with data, the forums would be a MUCH better place.
I own a Radeon 5850, but I'm trolling for Nvidia? My backorder complaints are here, under "the jian," 1st comment on page 2. I never did cancel... ROFL. They caved & shipped :)
http://www.amazon.com/Not-in-stock-Argh/forum/Fx120Z2HF6221CS/Tx3KAO4I35F7EFD/1/ref=cm_cd_dp_tp_t?_encoding=UTF8&asin=B002Q4U5EY
These jerks tried to change the part number to 585AZNFC to kill our 450 backorders. Shame on Amazon. Not our fault they priced it wrong to get orders and then it shipped at a higher price... LOL.

Also, I didn't arrive 3 days ago. I was here back when Tom Pabst, Van Smith, etc. were big time; before they kind of sold out (literally, in Tom's case, to go back to being a doctor, if memory serves, and buy a Beamer... hehe) and went Intel-loving. The SYSmark/BAPCo fiasco (which Van left over) soured me until recently, when I added Tom's back in, more for news than the reviews (got tired of DailyTech slants)... I've been around for a very long time, my friend (ouch... I winced as I said that... LOL). INTEL owned the land BAPCo lived on and even registered their domain, and they were on INTEL's campus... LOL. NO bias there.

You can ignore the data if you want; that is your prerogative, sir. But keep your personal comments to yourself, please. I'm not a troll, NOR dumb enough to open my window in 100-degree temps in Mesa, AZ and expect my room to drop. Good to see you didn't make a comment about the data, at least; it kind of validates it. I'd welcome one, though, as I'd always like someone to point me to a way to get temps in check in AZ; if it's worth reading, I will read it. Though on these current chips I think I've read every site in the top 15-20 :) Even some obscure ones, and I even translated a German one (pointed to by a guy commenting against me at Anandtech, post after post: russiansensation... fanboyish as his posts were, that was a good read). He was really reaching there... LOL. Russian, reading German and pointing to their sites in rebuttal, & arguing in English... OK... makes sense. :) I had to translate his stuff just to comment back... ROFL. Go Google. :)

Now can we get back to the data? :) Never mind...I have a test to study for...(yeah, the Microsoft treadmill never ends).

 

I don't think blaz really said the GTX 660 Ti tanks absolutely, much less that the rest of the 600-series tanks.
I'm thinking that you know that games sometimes favor certain cards (a possible influence being a GPU developer aiding in the development of a specific title). So how does one determine which is the better card? Well, one way is to know what software titles (not just games) you'll be using/playing. Other than that, to make a recommendation, all we can do is make a summary of titles that are most commonly played or that are new (since one may get a new GPU to play a new game).
The reviews here on TH seem to acknowledge this.

Memory bandwidth isn't always the limiting factor, as you acknowledged, but neither is GPU processing throughput. Sometimes it's not even either of them. It could be the architecture or the drivers (both contributing to the way it works in general), thus the affinity of certain games for either Nvidia or AMD, I think. Now, any one of those can hold back performance; just for the sake of this argument, say memory bandwidth. So the first example you gave, Max Payne 3, wasn't memory-bottlenecked, but that doesn't mean other titles aren't, so your data didn't really refute blaz's point "totally", unless I'm not referring to the same point that you are. You also go on to say that he is "completely false" in the article. Sorry if I seem to be too particular with words, but I would suggest refraining from those words (in bold) since they carry a tone of absoluteness, which isn't exactly truthful.

As I remember, the supposed tessellation advantage of the GCN cards over the Kepler cards was not that they have higher frame rates, but that they take less of a hit relative to their frame rate without tessellation. This was shown in benchmarks here a while back, with older drivers on both sides I might add (the article was the GTX 680 release article, unless there was a different comparison article, with the subtitle "Kepler Sends Tahiti on a Vacation" or something like that).

Actually benchmarks can show real-world performance, that's why we do them. Based on what you were saying though I'm guessing you meant to say "synthetic benchmarks." Just a clarification. :)

I'm not sure where blaz said the GTX 680 sucks. I honestly didn't look through the whole thread, but since you may have it fresh in your memory, maybe you could direct me to it.

Though Kepler may or may not beat GCN in gaming, we shouldn't forget about GCN's GPGPU abilities. But that's another story...

Sorry for seeming to be on the side of team blaz, but I just see some things here (a lot of it personal) that needed corrections or clarifications. Sometimes, all of us could do the first sentence (and just that one) of what hasten said in his last post. Emotions can run wild and cause us to be, well...irrational with our actions. You acknowledge that both cards can have situational advantages, and I'm thinking that is your main point. :)
(I just also want to note that I trimmed my quote of your comment by a bit, whatever I found irrelevant to my reply.)


Could you quote which post of blaz's was as you do described? Sorry if I didn't see it, but I'll be more sure of what you meant if I ask you directly. :)


I've been here for just about a year and a month, so not too long myself, though I have learned a few things, especially from my step-father who's been here for over 20 years I think.
I don't know if "nobody" here (or at least in Mesa) does it at all (I bet some who can afford to implement it do). One way to sort of achieve the desired effect may be to close the vents to the rooms that don't need air-conditioning and force the air to the rooms that do, like your PC room, though I'm not sure that would help, and if that results in less cool air getting to your thermostat, then your air-con might run longer.
You may find this funny, but my room actually has a curtain in addition to the tan blinds. Hey, it helps keep light out. :lol: In addition to that, we got to replacing the sun-screens of the whole house from the black (which came with the house) to tan in the hopes that the lighter color may reflect more heat away. We tend to open our blinds more now for light in good faith that our electricity bill would still do fine.
The sun is out for long periods here, but it also stays away for longer during the winter months which are coming (you may have noticed that it isn't out as long as before). I'm thinking that the sun stays away during the winter as long as it stays out during summer. You know like how the North and South pole have 6 months of day and 6 months of night.

I'm not sure if I should discourage you from having high hopes that Ivy Bridge will solve your problem, since I don't think temperature goes down that linearly with power consumption. One thing I learned is that because of the bad paste (compared to the fluxless solder used with Sandy Bridge) between the chip and the integrated heat spreader, Ivy Bridge tends to heat up more with overclocks (you'll have to look up how it does at the same GHz as Sandy Bridge since I don't quite recall).
One thing about this may be that since less heat is being dissipated, it wouldn't heat up your room as much. :lol: Not sure if that's factual though, especially over the time you use the PC.
If we ever got an i5 Sandy Bridge, I wonder if we really would feel substantially hotter in there even with the door closed. One thing as well is that the thermostat here is right outside the room with the PC, so that may influence it to turn on more, though the door is usually closed, like I said. Other than that, I'm not sure where the AC feeds the cold air to in the vent system (like if it's closer to that room).
I'm thinking that graphics cards still are the bigger contributor of heat rather than the CPU.

I'll keep in mind that mini review of yours of those sound systems. :)
 


People, don't bother reading this, just skip to the last sentence. It's only meant for blaz, you'll gain nothing from reading this...LOL.

I didn't ask for anything over OOBE, just OOBE but the exaggeration on my part was merely to make the point; accuracy of the downclock matters not. You did miss the point, we're talking about a DOWNCLOCK, not overclocking. If I was asking them to find the max then bench it there, that would be completely different. That would be STRESSING the card.

The two cars mentioned are not overclocked at all at 200mph, but you wouldn't downclock them (190, 65, 55 whatever) and test them there. They know the top speed of a car because they tested it to find out exactly what the top speed is.
http://www.topspeed.com/cars/lamborghini/2011-2012-lamborghini-gallardo-lp-570-4-superleggera-ar83494.html
Top speed 202mph. I didn't know you could overclock a car. Don't they just blow up when they redline? :) You did miss the point, which was merely: why test below OOBE? BELOW what they ship at is NOT overclocked. Testing what they ship at is not asking someone to run them "several times faster than the reference models and have to deal with greatly increased stress as a result." You are...I don't know, reverse-exaggerating my comment? In both the cars' case AND the cards' case.
http://money.cnn.com/2012/02/29/autos/ferrari_fastest/index.htm
Top speed 211mph. Nobody with this car brags to their friend it can do 55, 65, 150 or 180. They tell him it does 211mph! Correct? But that's beside the point.

Neither side is stressing their cards out of the box. I requested they bench out of the box, not reference. If it happened to be shipped lower than ref, I'd ask for that too. If manufacturers found a GPU couldn't get to spec so they shipped them clocked lower, I'd expect to see that too, not what you did to get it back to the GPU company spec. You are GREATLY exaggerating my comment. Both of us stated it would only get 10-15% (though much faster on the 660 Ti & 7950; the 650 is limited by NV)...Several times faster? Stop now. You're making yourself look worse than the comment YOU made that I corrected.

AMD's can easily hit 1150 (if you read my comments at Anandtech's site, I say AMD should clock them higher and CHARGE more before they go bankrupt; I'm not looking forward to getting gouged by NV or INTC), and NV's can easily hit 1250, as most reviews hit ~1300. Heck, they commonly do 1111 boost out of the box on their own, and that is just the MINIMUM. Unlike AMD, Nvidia's card works in a thermal envelope and what they sell is just the guaranteed lowest it will do; it may go higher on its own if the card is not running outside the envelope. The articles I pointed to say that; anand/hardocp both discuss it. AMD's will not arbitrarily go above OOBE on their own (and doing it yourself can cause damage as noted by guru3d etc.; can't be done on 600's).

If I had said you should buy a 200mph car and bench it at 600mph, your statement would be correct. The car would be stressed and my statement would be ludicrous. Again, no SHIPPING card is stressed while running at OOB shipped speeds. I may have DOWN-exaggerated, but I definitely didn't OVER-exaggerate.

You're defending NON realistic performance testing (they do NOT ship there). I'm defending the practice of REALISTIC OOB shipping speeds testing (the SHIPPING SPEED that people BUY, the shipping speed they are holding in their hands before running them SLOWER).

It is semantics when your statement doesn't change the point at all (if you had even got it correct to begin with). Whether I say 65 or 200 matters not. The point was people don't want to know performance of underclocked video cards. That doesn't change at 65, 200 or 500mph for cars either. No matter what they're built at, every review tells of the top speed of X model. You already assume it can do BELOW top speed, there's no point in proving that. :)

If I say put a potOto in the soup and you correct me and say put a potAto in the soup it's semantics. It still means the same thing and we both know it. The grammar nazi here wouldn't change the fact that we still need a freaking potato in the soup :) You're wasting my time, for nothing. I hope you're the only guy reading this. Nobody else got any smarter for it and I just lost 15 minutes of my life...ROFL. I love it when people start with comments like "to be fair"...LOL. Just stop yourself next time.
 


Regarding the AC, I guess I've been here longer than you (March 2010). :) Yeah, I noticed the sun times; you're not crazy with the drapes and blinds (pondering it myself). I already tried the vents trick and it just made the bathrooms/bedroom hot and didn't seem to do much for airflow to the desired room (that was a mild surprise). I tried that after the guy left, as what he did only helped a little longer. He went up inside to redirect air in the AC, banging around in the attic, and dropped it 2 degrees with a digital thermometer right in front of me, standing in the middle of the room with the PC on for testing (but door open). It took a while, but he didn't mind; his next job was going to suck in the heat, he said, so he just waited...LOL. You just can't beat the PC outrunning the AC (much like a fan/heatsink outgunned by an OC'd chip it can't handle, I guess). The fan over my head runs all day :) Which is cheaper than cranking up the air and ticking everyone else off doing it :)

Ivy will solve my problem because it's far more efficient and I'll be running it (as noted) at the same 3GHz as my current dual core. The two extra cores are free in this case, as the E8400 is spec'd at 65W (45nm) and the i7-3770K downclocked to 3GHz will come in well below its 77W (22nm). I should be able to drop the voltage substantially when knocking 400MHz off a quad. I can drop my chip/volts to 2.4GHz from 3GHz right now and resolve the issue :) But that kind of sucks in many of the games I play and is noticeable (dropping vid perf is out of the question at 1920x1200, so I must live with that regardless). But I'll be happy at the same speed with a chip three gens newer, much higher IPC, and much more power efficiency, adjusting on the fly also. The i7-3770 is no E8400 :) 8 threads vs. 2 will come in handy; I rar/par a lot. Multipar takes an eternity on anything over 5GB. Repeatedly doing it just sucks; god forbid you do it on a 25GB...LOL.
Packing or unpacking too...sucks both ways compared to the 3770 (I have friends doing similar stuff; I get killed...LOL). My Wolfdale is practically ancient tech in CPU terms. Without the door closed I can play for a few hours, but in the doorway when you enter you feel the change into the room. If I close it you'll notice in 2hrs, as it peaks around 3hrs or so, and I don't like it then. Of course I have two monitors too, and the 24in Dell puts off quite a bit of heat, surprisingly (or maybe not surprisingly...LOL). I have a 2407-WFPHC, so no reference vs. anything other than my newer 22in, which puts off far less heat. Not sure if Dell's newer 24s put off less now or not, but I assume as with TVs, yes. Touching the back, I'd say the Dell is a source of my problems, but it is beautiful to look at so... :) Newer tech = less heat/watts used, etc. I'm pondering putting a 27 next to it to replace the 22, which I suspect won't help me any...LOL. Especially if I go three-way with the biggie in the middle. That won't help at all. Most cards I'd say contribute a lot, but my card was bought for the low temps and is very nice in that regard. My PSU barely expels warmth also (PC Power & Cooling 750 Silencer); as expected, I'm barely working it.

On to Blaz:
Message edited by blazorthon on 09-14-2012 at 08:26:15 AM
Message edited by blazorthon on 09-14-2012 at 09:46:02 AM
Message edited by blazorthon on 09-14-2012 at 12:59:46 PM
Message edited by blazorthon on 09-14-2012 at 01:49:02 PM
Message edited by blazorthon on 09-14-2012 at 02:49:44 PM
Message edited by blazorthon on 09-14-2012 at 01:47:20 PM
Message edited by blazorthon on 09-15-2012 at 02:12:00 PM
Message edited by blazorthon on 09-15-2012 at 05:19:07 PM
Message edited by blazorthon on 09-15-2012 at 09:53:28 PM
Message edited by blazorthon on 09-16-2012 at 12:03:15 AM
Message edited by blazorthon on 09-20-2012 at 10:12:46 AM
Message edited by blazorthon on 09-20-2012 at 07:34:53 PM
Message edited by blazorthon on 09-22-2012 at 02:28:31 PM
Message edited by blazorthon on 09-22-2012 at 03:04:06 PM
Message edited by blazorthon on 09-22-2012 at 03:13:46 PM

That may be difficult to do when someone is the #1 message editor in the last two pages by 14 messages...That's just the last two; nearly every post he edits...My god man, keep something in print or check your crap before you post. Nobody else has more than two...LOL. At times I hate the ability to edit; I'd rather see a 2nd-post correction system. I hate to full-quote people, but maybe you have to here. I snipped yours just to keep the page from filling with just your post and my response. I didn't realize I'd have to watch for this, but let's see if anything is left...ROFL. I'll try to get as close as possible with what's left.

I didn't read the whole thread either, and don't think I went back more than 3 pages...ROFL. I don't really have the time to read every one either. I have a cert to get before the weekend (hopefully), as company comes Friday from CA and will take me out of it for a good week. Not that my memory sucks, but a week out of it and I may have issues and forget crap/retrain/retest. I'm not getting any younger, and neither is my memory...LOL. I've already blown off purchasing Torchlight 2...and that is really painful for me...ROFL. Thank god Baldur's Gate 1 Enhanced got delayed or I'd be going crazy. :)
Checking further, it's the same on page 2: another bunch edited.

Running into other ridiculous posts as I go back though: blazorthon 09-13-2012 at 08:48:20 PM
"The Batman AC results showed x8 MSAA being about 85% efficient with the Radeon 7950 and 7870 at 1080p and about 80% efficient at 2560x1440. "
http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/7
Batman AC, 7950 Boost, scoring max 64, min 17. That's like a 70% drop @ 2560x1600 and only 4xMSAA. Note it's the exact same on the 660 Ti...LOL. Validating jaquith's posts that the hit is unplayable and MSAA problems aren't worth quoting, even if he was correct (and I prove the "NV tanks" claim wrong...the 660-690 do NOT tank any more than AMD, as shown).
Drop it down to 1080 (1920x1080) and raise it to 8x (same game, next pic below):
660 Ti min 28 / avg 46.5; Radeon min 22 (ouch), avg 48.6. Max for them was 95! So still dropping from 95 to a min of 22! HOLY COW BATMAN...LOL. It's even worse than jaquith said...he said a 46% drop was not worth it. He's actually sugar-coating it; jaquith would have been correct saying 75% is not worth it...LOL. Note that in the very game he's picking out, the 660 Ti is winning mins by 6fps. This is NOT playable on the 7950 Boost here. Note hardocp's comment.
"Important FACT: Both video cards are playable at this setting, but the HD 7950 does lag a bit during tessellated scenes."
While a bit faster at 95, the dip to a min of 22 is unplayable, and it's down below 30 a lot in the graph. I care more about mins than max; I hate stutter and fps drops, though hardocp makes their statements on the AVG (I like that they at least avoid the max; it's pointless compared to mins, which make a game unplayable).
Either way read that article, nearly everything is unplayable at 1080 with most below 20fps with MSAA on (with both cards). Also they proved (not sure in this or other link I gave from hardocp) that memory bandwidth had no bearing on 660ti in any game. There was NEVER a game playable on 7950 that wasn't on 660TI also.
Skyrim is there too, tanking on the 7950 from 104fps to 55 at 8xMSAA! That's nearly a 50% hit too! And only at 1080p, not the 2560x1440 where he says it's 80-85% efficient. Note both are playable, with NV hitting mins of 50fps; they dropped from 95 to 50 (a smaller percentage drop than the 7950 Boost).
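To keep the percentages straight, the "hit" being argued about in these posts is just (max − min) / max from the hardocp charts. A quick sketch, using the numbers quoted above (which are my readings of the graphs, not official figures):

```python
# Percentage fps drop from peak to minimum, as discussed above.
# Inputs are the approximate chart readings quoted in this post.

def fps_drop_pct(peak_fps: float, min_fps: float) -> float:
    """Return the percentage drop from peak to minimum frame rate."""
    return (peak_fps - min_fps) / peak_fps * 100

# Batman AC, 7950 Boost @ 2560x1600, 4xMSAA: 64 max -> 17 min
print(round(fps_drop_pct(64, 17)))   # prints 73 (the "like a 70% drop")

# Batman AC @ 1080p, 8xMSAA: 95 max -> 22 min
print(round(fps_drop_pct(95, 22)))   # prints 77

# Skyrim, 7950 @ 8xMSAA: 104 max -> 55 min
print(round(fps_drop_pct(104, 55)))  # prints 47 (the "nearly 50% hit")
```

So the quoted "70%", "80%" and "50%" figures are round-number shorthand for 73%, 77% and 47% drops.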

I could go all day on his statements. Including (as I'm going back digging to find what you requested; I can't seem to even find the poster's "limited experience" comment he picked on):
blazorthon 09-14-2012 at 08:22:51 AM:
"AMD has plenty of finances to compete."
responding to demonhorde665 post on page2. he goes on to say:
"Sure, AMD is a far smaller and poorer company than Intel, but they are still a multi-billion dollar company and they have enough money to compete in the x86 CPU markets properly"
Blazorthon is again patently incorrect. I follow stocks, and he obviously has no business talking stocks either.
AMD is billions in debt. Total liabilities are 3.9B. More than the whole freaking company is worth. debt to equity 1.81 as of Q2 2012! Total losses this year -629 million.
http://investing.money.msn.com/investments/stock-price?Symbol=US:AMD
He should do his homework before commenting, preferably learn how to read a balance sheet. They will be out of business at the current rate by Christmas 2014 (I say bought for a song, but same thing; someone will have to bail them out). They have less than 1.5B in the bank and are burning 600mil/year in losses (that's only ~2.25yrs before you're totally out of cash). The reality is that in the next 4 quarters they may have to offer yet another dilution of their stock, most likely via senior notes, to pay on the debt which will come due again shortly. P/E -4.12. Projects Denver and Boulder won't help AMD either; for the first time they'll be competing with NVDA on CPUs. Should be interesting in 2013/2014.
http://www.brightsideofnews.com/news/2012/9/20/nvidia-project-boulder-revealed-tegras-competitor-hides-in-gpu-group.aspx
http://www.xbitlabs.com/news/cpu/display/20120921010327_Nvidia_Develops_High_Performance_ARM_Based_Boulder_Microprocessor_Report.html
http://investing.money.msn.com/investments/stock-price?symbol=nvda&ocid=qbeb
So NVDA sells 4bil worth of crap and makes 12% on it (473.78mil/TTM); AMD sells 6.38bil worth of crap and loses 629mil/TTM. AMD market cap 2.45bil vs. NV 8.46bil (never mind the Intel #'s; we know they're great). Important to note that AMD will be getting it from ARM (not just NVDA either) and Intel on all fronts. This will only get bleaker for them. No, I do NOT like this. I'm not looking forward to $1000 CPUs from Intel again, and $1000 video cards from NV, with only crap cards at $300-500. But we're going there soon.
http://investing.money.msn.com/investments/financial-statements?symbol=US%3aAMD
I'm not sure Blaz realizes AMD has lost 6 billion in 10yrs. Whoever approved the ATI purchase at 3x their value (later written down by 2/3) should be tarred and feathered and never be allowed to work in tech again. They killed this company and cost them their fabs, putting them forever in debt just to get weaker and weaker year after year.
Note also the dilution of shares on that page: 344mil outstanding in 2002 to 698mil in 2011 (worse now). This is why they're at $3.50, having lost half their value this year alone.
10yr Nvidia? Profit of ~3bil over the last 10 years, including the first 9 months of this year (same as with AMD; I included their 629mil loss for the first 9 months of 2012 also).
R&D for AMD has gone down from 1.8bil in 2010 to 1.45bil now. Meanwhile, NV's has gone up from 691mil to 1.02bil. Just like Obama and our debt, waste, no jobs, etc., AMD is going in the WRONG direction. I can post numbers all day showing anyone who thinks AMD is in a good position right now is on crack. We are comparing a GPU company to a CPU/GPU company here also. I'm not even going to mention Intel's 4bil investment in 14nm etc. and go through their numbers; it would make even my statements look generous.

I still can't believe AMD keeps cutting prices. They are killing themselves thinking (Jen laughs about this, and rightly so) they will price NV to death. NV has no debt and you have 3.9B in liabilities. They have 3.5B in the bank and you're at 1.5B. You can't win this war. Stop now; Jen has nearly 2/3 of his wealth in the company and wants the war to stop so he can see his stock appreciate. Why doesn't AMD stop? I'm not saying I want $1000 cards, but for your company's sake quit the freaking cuts! I'm not alone; stockholders say the same!

AMD is dumb. They clocked their cards too low, then try to price the enemy to death with them. NV clocked too low too, as they all hit 1300 as noted. No reason to run hot with AMD clocking so low though. I kind of get it I guess, but were they both going for ZERO RMAs this time? They are nowhere near their limits even on OC cards. Either way, if Blaz had bothered to read a balance sheet he would NEVER have said they are able to compete financially. That is completely false. I'm not even listing all the tech info here, as only stock people would get the nitty-gritty details, but these are the easiest for most to follow. You start talking EBITDA, book value, P/CF, P/S, 5yr P/E ratios etc. and most just look at you like you're on crack.
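For what it's worth, the cash-runway claim in this post is just cash on hand divided by annual burn. A rough sketch using the round figures quoted here (my approximations of the post's numbers, not audited financials):

```python
# Rough cash-runway estimate from the round figures quoted in this post.
# These are the post's approximations, not audited numbers.

def runway_years(cash_on_hand: float, annual_burn: float) -> float:
    """Years until cash runs out at the current loss rate."""
    return cash_on_hand / annual_burn

cash = 1.5e9   # "less than 1.5B in the bank"
burn = 0.6e9   # "burning 600mil/year in losses"

print(round(runway_years(cash, burn), 2))  # prints 2.5
```

So 2.5 years at most at that burn rate; starting from "less than 1.5B" is how the post arrives at roughly 2.25 years.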
http://www.fool.com/investing/general/2012/09/18/ask-a-fool-is-there-any-hope-left-for-amd.aspx
Watch the video...
http://www.forbes.com/sites/briancaulfield/2012/07/19/uh-oh-one-of-the-chip-designers-who-almost-got-amd-into-macbook-air-joins-apple/?utm_campaign=20120720&utm_medium=rss&utm_source=topic-tech&partner=obtech
I don't care much about CEOs really, but losing people like Dirk Meyer (DEC/Athlon designer) and Bruno (Llano etc.) is a very bad thing. You lose your brains, you're stuck with marketing, and AMD has always sucked at that and has no money to fight an ad war anyway. Anyone believing AMD isn't in trouble hasn't read their financials.
http://www.techpowerup.com/155920/AMD-To-Give-Up-Competing-With-Intel-On-x86-CPU-Prices-Already-Shooting-Up.html
These comments were reported everywhere. Ouch.
They are nearing the end of Dirk's 5yr roadmap and losing engineers. That's not good for the next 5yrs once Dirk's pipe is gone (Bullsnozer is about the end). They stopped listening to him when he had to fill the CEO job; Hector killed these guys all around, just like Motorola. He was a LOSER. Bring back Jerry Sanders dang it :) Again, an ENGINEER! When they stopped listening to Jerry/Dirk they went into the toilet as their roadmaps came to an end.

Found the "lesson needed" post...Geez, I'm quickly running out of time here. "blazorthon 09-13-2012 at 01:06:15 PM"
"a clear sign of a lesson being needed if you want to be accurate about what you're saying"
I think I pretty clearly laid out his inaccuracy regarding MSAA (AND THE FINANCES TOO), as both cards take a hit and are usually unplayable with the 660/7950 @ 1080p/2560x1600. But just in case people can't be bothered to go READ the site:
http://hardocp.com/image.html?image=MTM0NjA2MDA5M09tVGRTMlE0eDNfN18zX2wuZ2lm
http://hardocp.com/images/articles/1346060093OmTdS2Q4x3_7_5.gif
80% DROP from max to min in Batman AC and 50% in Skyrim for the 7950...He's yapping about accuracy? Dissing me for it too, and correcting me on car analogies?...ROFL. His statement is completely incorrect...It's not 85% efficient; it's the exact opposite, precisely where he said, in Batman AC. It's 80% inefficient. Going from a 95 max to a 22 min (and staying there a lot in that chart) is unplayable first, an unacceptable loss second, and to boot they lost on mins to the 660 Ti's 28fps here, which is still playable. Well, not to me; I say 30+ or you will see stutter, and this is a small snapshot, not 3hrs worth of gameplay; you'll see lower than the 28fps and 22fps Kyle shows here. Batman AC at 2560x1600 fares no better, dropping from a 56 max to 14fps on the 7950 Boost. Again, just as bad and the exact opposite of what he said to jaquith. And jaquith was being overly generous IMHO, based on tests outside Tom's as shown.
http://hardocp.com/images/articles/1346060093OmTdS2Q4x3_6_6.gif
Battlefield 3: again, a 93fps max drops to 34fps on the 7950 Boost. This at only 4xMSAA, not the 8x which blaz claimed was 80-85% efficient. That's a huge drop.
"Important FACT: Both video cards are playable at this setting, these have similar performance and gaming is great at this setting."
The first game you can actually play on both in that article. I'm not saying AMD sucks here, just pointing out he is factually incorrect, and blatantly wrong based on how far off he is. I would have NO issue putting that in bold...LOL. Max Payne 3:
http://hardocp.com/images/articles/1346060093OmTdS2Q4x3_6_4.gif
I linked to all this before; I'm repeating myself now. A 37fps max drops to 12fps, and this at 1920x1080 (1080p!). Note NVDA wins at 15fps max, but it would be ridiculous for me to quote this crap and claim victory for NV here, as they are both completely and unequivocally unplayable. Note Kyle's quote again:
"Important FACT: Neither video card was playable in this game at 1080p with 8X MSAA. The lower width and lower memory bandwidth card was FASTER than the higher memory width and memory bandwidth card."
Lower-memory 660 Ti faster...Hmm...strange; blaz not accurate on this either.
blazorthon 09-13-2012 at 01:40:29 PM
"That's a big part of why the 660 Ti and the 660 struggle far more than even the GTX 670/680 with AA."
We're not going to see him saying this anymore, are we?
jaquith also mentions other forms of AA; again, correct. SSAA is better on NV from what I can see on the web, perf-wise. He states (after much back and forth):
"I get the MSAA 'thing' (exploit) but that's only one form of AA, so please have a move towards the middle and be objective."
Agreed...And if the hit is too big, quit quoting it as anything useful to know. Finding out X card sucks in a position the card would never be played in is pointless. That's the beauty of a great, well-balanced design that wins where you WILL play, vs. quoting stuff a company wisely avoided optimizing for, since you'd NEVER run at <30fps (12? 15? 17? ouch) without hating your PC. This is all discounting the benefits of PhysX regarding fun/looks too, which can be turned on for free as shown previously in my wall of text...LOL. Barely breaking double digits would not be worth NV's time investing to win at a totally unplayable situation (and I flatly proved him wrong anyway). I want the best where you'll actually play and have fun. Jaquith appears to clearly understand this, while blaz desires to quote stuff you shouldn't care about #1, and is factually incorrect #2. I tore Ryan Smith's review apart at Anand for this.
jaquith 09-13-2012 at 10:14:20 PM
While not eloquent, and downrated, his points are valid. As shown, jaquith was actually further correct than he states based on hardocp, anand etc regarding MSAA.
"jaquith :
Don your review is a pure hatchet job, it's that simple. Frankly, it's the only one I could find where MSAA was a deliberate con job."
Not sure if I'd call it that, but claiming one sucks (or showing it in pics) without showing the other AA modes is suspect. Then people like blaz quote this crap like it's scientific fact.
Cleeve: "But here's the deal: it was translated into multiple languages and read by millions. Literally.
Your trolling is limited to English, on this forum... and most people can't even see it here because it's been voted down so much. 😀"
Nope, I read it, and your defense wasn't too good. Hard to argue for showing only ONE AA mode, then calling the other guy a troll for pointing it out. I got the same thing from Anandtech for claiming the 660 Ti was for 2560x1600...LOL. Because some monitors you buy off eBay from a guy in Korea are the new fad everyone uses...ROFLMAO. You can't even call the guy; he had ONE review on the site at Amazon (it was NEW) and he had a Gmail address for returns...ROFL. He has no domain, for god's sake. Not defensible, so they called me an Ahole and said I was ill-informed. They didn't say anything about the data though...chock full of benchmarks from everywhere, and the Ryan argument purely used his own articles.

Jaquith finally realized he was banging his head against the wall with this: "Since I don't care about any of these GPU's that's all I care to say."...LOL. I believe he was getting a headache...ROFL. He comments on the wild swings affecting his eyes also...I agree again; the charts in the games I showed at hardocp are all over the place. The Battlefield pic he posted showed this too...a cleaner line on NV vs. AMD. I can't confirm his statement about Don running AMD and feeling the "AMD loyalty", but his benchmarks kind of make me agree with jaquith, knowing I can't find the weakness he shows without it being the exact same or worse on AMD. The numbers above don't lie. Despite Don saying he's running a 680 Lightning, it is strange. Why not run at least ONE other mode? I already suspect it's reversed in the other modes, but I'm not willing to claim that's why he leaves them out, and I'm not willing to say Don is biased yet.

But I left years ago for other sites because of the nasty stuff they did to Van Smith for pointing out the obvious back then, which was that Intel cheated with Bapco and AMD was kicking the crap out of Intel. Intel was having recall after recall too. Tom pretty much drove him out (Intel ads were up all over too...LOL). Van proved his opinion on his own site (google "van smith bapco" circa 2002). Here, I did it for you all:
http://www.vanshardware.com/articles/2001/august/010814_Intel_SysMark/010814_Intel_SysMark.htm
http://www.vanshardware.com/reviews/2002/08/020822_AthlonXP2600/020822_AthlonXP2600.htm
Repeated Excel searches etc., stuff you just don't do a dozen times in succession (much like what jaquith is pointing out here: exploiting a perceived weakness of X brand). It was the downfall of Tom's site value; as Anand became the leader, estimated at 10mil worth, Tom's dropped to 1-2mil as people realized the truth. Tom sold shortly after, as it was falling apart (good move, Tom). Gotta get your mil before it hits zero...can't blame him...I'll read another 10 of Don's articles before I make a new judgement and stay or go again. :)
"September 2006 – Intel's CEO, Paul Otellini, refers to Tom's Hardware as his favorite tech website and quotes the site at the Intel Developer Forum (IDF)"
ROFL. From wiki https://en.wikipedia.org/wiki/Tom's_Hardware
"April 2007 – TG Publishing LLC and Bestofmedia Group merger"
Yep, I left around that time...I only recently started reading the news here, and I consult articles when I've exhausted what I considered better sites first. AMD hate gave way to NV hate from what I've noticed, as jaquith seems to have adequately pointed out. They were even removing Van's name from his articles here back then (putting in other names/editor). It was awful & reprehensible.

I stand by my statements and don't mind keeping bold stuff in print for good, this won't be edited :) I am not wrong. I proved he is. The only part he's correct on is this: "AMD simply continually makes stupid mistakes". Agreed, that's why they are financially unable to compete now, and unless they put out some magical chip that turns profits around (hard to do while being eaten at all ends of your products, cpu/gpu on multiple fronts, with everyone around them swimming in cash and you being in debt with downgrades, just like our country's credit under obama 2 times already! first time since 1917!), they will be going out of business. The arabs won't be magically funding them anymore to keep them in business with the final ties to GF being severed. Investing in them at this point is almost like buying into Solyndra. Someone has to buy them to keep them alive past 2014 and have them remain competitive. My hope is it's Samsung, IBM or NV (if they drop enough). If it's NV I hope they don't pay like AMD did for ati at 3x its value...RETARDED. Jen doesn't seem that dumb. He's highly intelligent and runs his company as if his 2/3 wealth depends on it. He has only made smart purchases, well within their reach and not overextending the company's finances. Smart man. DEBT=DUMB. A quick release of the A8 5600+ may ease some pain, but it better sell extremely well. NV will have Kepler top to bottom by xmas so the cpu's have to help. It's not enough for the gpu side to cover the interest on the debt. The cpu side must start helping their bottom line more. You can't sell 6Bil and lose 600mil and stay afloat long.

I already proved the 680 has no issues and leads almost everything in the previous post, even in MSAA (see the radeon 7970 clocked at 1298 comment vs. the 680's). I keyed on the 660 here as the 680 only makes all my stuff slanted more towards NV. NO point in proving something faster than the 660 TI, which already has no memory problems, as abundantly proven. Kyle repeated it over and over at hardocp. That was the point of his SUPER AA testing. To find out the truth. Also, as you alluded to and I previously stated (can't blame you if you didn't read it all), I can find something AMD wins in, though it's more difficult than going the other way. It won only one MSAA test vs. the 660 TI in hardocp's super AA testing. The same can be said of the 680 vs. superclocked 7970@1298 review link I posted. I see no evidence of a tessellation advantage on AMD; as Kyle points out, it's kind of a weakness IN game.
http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/8
Conclusion says it all if the benchmarks aren't enough.

Synthetics were not what I'm talking about, but yeah, I'd rather not even see them except to make sure everything is running ok. Not as a perf statement. I can't game on sysmark, etc. :) These types tell us nothing, at best only show us something we may never do, and are easily manipulated (see bapco crap above). hardocp isn't running the same type of benchmark as anand/toms etc. "Real-world gaming" is not the same as what these guys do, and it takes a lot longer than a canned test. Hence most just run a canned one.
http://www.hardocp.com/article/2008/02/11/benchmarking_benchmarks
I'm not talking running timed demos. You seem to like knowledge, I think you'll find this article very interesting and may tend to lean towards people who do this. I haven't been checking others recently but I'm not aware of anyone else doing this. It's just time consuming. If anyone knows another site that does "real world" test please link here :)
"We think our job is to explain the level of gaming experience you should expect when you purchase the video card we have evaluated. If we are suggesting to you that you purchase a $400 video card, I think we had damn sure better serve up a reason to you other than a timedemo, canned benchmark, or synthetic benchmark metric. Some people think video card value can be communicated by running simple timedemos; we contend it’s not that easy. It was at one time, but those days have passed. At least for HardOCP. "
Note anandtech gets attacked...LOL. I'll also note I attacked Kyle once before. We had a virtual pissing contest circa 2003, I think. Reviewers are so touchy when you point out things they have a hard time defending :) I have it saved somewhere on one of my drives...But that would take forever to find. I have 9 externals...ROFL. 2x3TB/4x1TB etc.
http://www.hardocp.com/article/2009/05/19/real_world_gameplay_cpu_scaling/
Another year later, discussing it. You can just search for real world testing on their site and get hits to read if desired. They have a lot of interesting stuff.

I'll forgive you now for not reading this :) You'll need a day 😉
I hope cleeve doesn't do this:
"Play nice or you're out of here, chief, along with all the posts you've ever written on this forum. 😉 "
There is nothing nasty in here, I just pointed out the facts. I never called him a name, etc. I pretty well left anything about blaz himself out, just argued with his data. Which amounts to one pic from a toms test and lots of statements using it as science.
blazorthon 09-16-2012 at 11:25:18 AM
"Different settings can affect performance greatly on some card while not so much on others. For example, using heavy MSAA doesn't hurt AMD's Radeon 7000 performance much whereas Nvidia takes a nose-dive when it's used. "
I hope he's done saying this stuff. That's all Nvidia cards he's talking about vs. the 7000's (all AMD cards). But the comments I was talking about were worse. I think they've all been edited or just plain deleted.
blazorthon 09-22-2012 at 02:26:48 PM
"I'd also still be worried about the 660 Ti when you play with some good AA such as 4x MSAA and 8x MSAA. Even the 670 shows weakness in this, so I have no doubt that the 660 Ti would still do poorly in it. Even 1080p can show the 670 and especially the 660 Ti waning significantly in heavy AA while the Radeons with GCN GPUs just keep chugging along at about 85%-95% of the frame rates that they had without AA."
I really hope this is the last of these types of comments, as I proved this patently wrong. I'm not saying AMD can't win a test, but they do not keep chugging along at 85-95 percent (now he's upping it to 95%...LOL). Nope, it's an 80% HIT, not 80% efficient, never mind the upping for exaggerated effect. I showed the 680 in the previous wall vs. the 1298 OC 7970. I showed the 660 in many articles, with the AA SUPER tests being the most quoted as they're directly against his statements. But many 660 reviews are in there, and this article. No point in discussing his 670 dis; if the 660/680 can be proven strong and not like he says, the 670 does the same thing, and I am out of time :)
"I really like how the 7950 turns out in this and would not want anything from the Geforce GTX 600 series for a higher end gaming system, especially at 1080p. I really don't like playing at 1080p without some serious AA and Nvidia simply doesn't deliver in that. "
Seriously? That's the ENTIRE 600's series he has dogged...I think that's close enough to what I said he stated and that one was directed at me and my statements. Also an EDITED message by him. "Message edited by blazorthon on 09-22-2012 at 02:28:31 PM"
I think we're done here :) I have trainsignal/nugg stuff to get back to...so please accept my apologies if I don't get back quickly (if at all...Not worth banging my head over him any longer). I have a BBQ to build before friday too for company from CA. YUCK to the company...I rather like the idea of the new Weber Summit E-470. I smell steak :)
Enjoy your day :) Keep the windows shut....LOL
 
Army, Somebody, there's a thing that you two might have missed. I'm not going to read a book's worth of text to find out if this was rectified, but did anyone consider the possibility of blazorthon simply forgetting to add in the MS before the AA? That's a completely understandable mistake given that this is what people usually use in reference to AA. If that's the case, then blazorthon was correct.

So what, blazorthon edited some things. People make typos or want to add in things often and blazorthon is simply trying to ensure that what he said was accurate. You act as if it is a crime to fix mistakes. That's no worse than these ridiculously large and poorly formatted posts of yours that are difficult to read through.
 


So if it said MSAA, you're saying he'd be correct that the NVDA 600 series (all Kepler) isn't worth having because they tank and the 7000 series doesn't? Sorry it's too difficult for you to read (ok, it might be long, but if you read it it's easy to follow). I was trying to write it as I was finding what army_ant7 requested. Let me make it really simple for you to understand. Every link (besides the financial ones) is pointing to the 7000 series cards TANKING in MSAA (8x and 4x depending on the game). Also, 3/4 of the games are BELOW 30fps min. Jaquith made this point already - if the hit is so large it isn't playable you shouldn't be quoting this anyway - on both AMD & NVDA. Another person angry about the length of a post instead of reading it and giving an informed rebuttal with data to back it up.

Maybe you should try to muscle through it line by line. :)

Too difficult?

Just look at the pictures then. I put them there for people like you. Just read the summation at the end of each game at hardocp. There is no game in their tests where the 7000 series is playable and the 600 series is not. Just click the links. They're the ones that are underlined (most likely in blue). Your middle mouse button should open them in a new tab. Simple enough? I missed nothing. The entire post is about their financial troubles and MSAA tests. EVERY single game quote is about MSAA. The title/summary of the hardocp article is:

"NVIDIA GeForce GTX 660 Ti at High AA Settings Review
We take the GeForce GTX 660 Ti and compare it clock-for-clock with a GeForce GTX 670 to see what the memory bandwidth difference means at high AA settings. We also take the GALAXY GTX 660 Ti GC video card and compare to a Radeon HD 7950 w/Boost at high AA settings. "

That means the whole article is about MSAA. Every picture in their article is MSAA and has it at the top of each one. Do you have trouble reading AND seeing? Kyle's article is very well formatted; what's wrong with hardocp's format? Every conclusion after every game says they are both either playable or unplayable. Not one is and one isn't. If one isn't playable, neither was the other. If one is playable, so was the other. Get it? I don't think I can make that any simpler.

I didn't say it was criminal to edit posts. I merely pointed out that it's difficult to point out someone's statements when they are changing every post. Again factually correct. It's difficult to score a touchdown if the goalposts are constantly moving. Understand?

A long and ugly post is not wrong. By the very nature of what you're saying, a person EDITS their post to make it factually correct (you just said this, I agree). ERGO - SOMETHING IN IT IS WRONG or there would be NO EDIT.
You said:
"If that's the case, then blazorthon was correct."
Prove it. I did. Especially the finance stuff...ROFL. He at least seems to know something about PC's though I think I refuted his claims quite easily. He has no business talking finance without looking at a company's balance sheet, earnings, debt-to-equity, PE, EBITDA, TTM earnings etc. I could go on but I don't want it to be too long for you. :) If you followed the stock market you wouldn't touch AMD's stock with a 10ft pole. They gave up the x86 performance race with intel (link was provided!). If that isn't the exact opposite of what he said I don't know what is. They cannot compete with Intel, and I gave him a break by comparing them to NVDA (Intel dwarfs them). I provided a motley fool video saying the same thing (you can't HEAR either?). They are in trouble. Understand? The Gardner brothers are not FOOLS :) Therein lies the joke 😉 These guys are RESPECTED people in the financial community. Heck, you could argue that some of the people in their forums are the most intelligent out there. No fanboys there. They are ONLY fans of making money any way they can. Losing it, is against their religion :)
http://www.macroaxis.com/invest/compare/AMD,nvda,intc,txn,stm,qcom,mrvl,armh
Probability of AMD going bankrupt 49%
Probability of INTEL going bankrupt 2%.
Probability of NVDA going bankrupt 1%
You can't compete when your chances of bankruptcy are 49% while your competitors are the best in the business. Qcom and ARMH are also 1%. IGMNF also 1%. I'm not making this stuff up. It's not good when financial people think you're going bankrupt. Life is much more difficult when you're HIGH RISK.

One more time: PROVE IT. And use a site other than toms, as I'm not sure how they got their results based on other sites not showing the weakness (that isn't on both).

Jeez, just read the last page of Kyle's article.
http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/8
Please refrain from making comments if there is nothing to back up what you say. I promise you, if you give me a "wall of text" I will read it. Count on it. But you can count on this too: If you are INCORRECT, I really don't care if it takes me 2 words or 10000, I'll rebut it with DATA not comments I can't prove or ridiculous statements about you or the length of your post. If the post was too long for you to read, don't read it. But don't come claiming Blaz is right if you didn't and can't refute anything more than the length.
 
blazorthon didn't say that Nvidia can't handle MSAA, only that AMD handles it far better, which is accurate. You're completely overreacting and in an overly offensive way.
http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/2
Straight from your own links, this shows that the 660 Ti does in fact handle MSAA worse than the 670. The performance difference increased with 4x MSAA compared to 2x MSAA in an otherwise identical test.

blazorthon's Batman link already showed this is also true in Batman:AC.
However, HardOCP tested it even more thoroughly and I can't ignore this:
http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/4
Clearly shows that not only does MSAA scale better with higher memory bandwidth, but so does CSAA; being a higher level of AA than 8x MSAA, it shows the effect even more strongly, so it seems to behave similarly.

http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/3
This test from the same review shows the same thing.

http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/5
This shows that keeping the same AA settings, but dropping resolution, also has the same effect on performance as dropping AA at the same resolution does, another of blazorthon's points. It also shows that SSAA does not behave like CSAA and MSAA do, but this has been shown previously and is not a surprise. It's also generally too intensive for any graphics configuration with any game anyway, so although it's good for proving that you shouldn't generalize AA, that's almost all that it's good for. Still, the 670's memory bandwidth and ROP advantage let it pull ahead of the 660 Ti despite SSAA seeming to not care about it so much, although when it was maxed out, the difference was, admittedly, less than 10% (unplayable on both cards regardless).

http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/6
Here's a kicker. What this really shows is the 660 Ti 2GB's memory bandwidth/capacity problem to a greater extent, and how a much more highly clocked GPU can make a difference. What this doesn't show is that the 7950 would win more often and more significantly in an OC versus OC comparison, and that the 3GB 660 Ti's have a distinct advantage over the 2GB models when more than about 1.5GB of memory is in use. As has been said by others, this is caused by the last .5GB (or more accurately, .5GiB) running at a third of the memory bandwidth of the first 1.5GiB of memory, a problem that the 3GiB models rectify.
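For anyone curious, the arithmetic behind that asymmetric-memory point can be sketched quickly. The bandwidth figure below is an assumption for illustration only, not a measured number:

```python
# Sketch of the 2GB asymmetric memory layout described above.
# full_bw_gbps is an assumed total-bus bandwidth, purely for illustration.
full_bw_gbps = 144.2              # assumed total bandwidth of the full bus, GB/s
upper_bw_gbps = full_bw_gbps / 3  # the last 0.5 GiB runs at a third of that

print(f"First 1.5 GiB: {full_bw_gbps:.1f} GB/s")
print(f"Last 0.5 GiB:  {upper_bw_gbps:.1f} GB/s")
```

The point of the sketch: once a game pushes past ~1.5 GiB in use, part of its working set sits in the slow region, which is consistent with the 3GiB models not showing the problem.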

So, although blazorthon may or may not have over-exaggerated the impact of memory bandwidth, blazorthon was correct in that it can matter greatly. For example, the 670 has a 33% memory bandwidth and ROP count advantage over the 660 Ti, yet in the heavy AA tests, it generally had a 15-25% performance advantage in average FPS over the 660 Ti. For a 33% advantage, that's quite significant. The 7950w/B's comparatively poor showing despite more memory is not only attributable to the 660 Ti 3GB's faster GPU, but also that Nvidia gets somewhat more memory bandwidth with the same memory configuration, kinda like how modern consumer Intel platforms get more bandwidth out of the same memory configuration than AMD does with any of their consumer platforms. Basically, AMD has higher bandwidth, but not to such an extent as the numbers may lead you to believe. I'll try to find my links that show this, so please bear with me while I look.
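To put that 33%-advantage/15-25%-gain observation in terms of scaling efficiency (what fraction of the 670's resource advantage actually shows up as framerate), a quick back-of-the-envelope using only the figures quoted above:

```python
# Figures taken from the comparison above; this is a rough illustration,
# not a measurement.
resource_advantage = 0.33                    # 670's bandwidth/ROP advantage over the 660 Ti
perf_gain_low, perf_gain_high = 0.15, 0.25   # observed average-FPS advantage range

eff_low = perf_gain_low / resource_advantage
eff_high = perf_gain_high / resource_advantage
print(f"Roughly {eff_low:.0%} to {eff_high:.0%} of the extra resources show up as FPS")
```

Anything well under 100% here means the cards aren't purely bandwidth/ROP bound in those tests; anything near 100% would suggest they are.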

Regardless, HardOCP has already shown that even junk 7950s such as the XFX models can generally beat the best GTX 660 Ti models in average gaming performance unless you really hate high levels of AA (obviously excluding SSAA and since no amount of FXAA is a high amount of AA, also excluding FXAA).
 

Lucky for you then that WinRAR recently got (better) multi-threading support. I'm not totally sure, but I remember that previously it either only used one thread or it wasn't threaded nicely, unlike the 7zip application (though that could only unRAR). This is irrelevant if you're using any other RAR encoder, of which WinRAR is the only one I know. :) Aside from that, plain core performance is also a big factor.


Are you implying that blaz is playing dirty here? And I mean changing the points of his posts rather than just clearing typos or making his points more understandable.

I for one like the edit system since you don't necessarily have to add more bulk to the thread just to correct an error, and it could also lead to less wrong info/typos taking up space. The downside to it is of a more personal nature, I would say. Like when you're claiming someone said something but then he edits his post (which seems to be your problem with this). Facebook had the (good) idea of implementing an edit history available for posts that were edited.

BTW, if you're concerned about your post seeming like a wall, then you could use the quote system more granularly (i.e. quoting a post a segment at a time as you address it). Yes, it would make the post technically longer because of the way it shows up (the box the quote is in and stuff), but it would make it seem less like a wall and easier on the reader's mind, making it seem easier to read. That may just be me though...


Again, when did blaz say that the 600 series tanks with (MS)AA? Worse of a hit with AA on doesn't necessarily mean it tanks.

"A 70% drop"? What are you referring to? The comparison of the max to min frames at a given setting? If you're using this against the efficiency percentages then you got it confused. What he was referring to was comparing frame rates with and without x8 MSAA, not with max and min frames.
Unfortunately, that article didn't provide numbers for the HD 7950 and GTX 660Ti with and without AA at a certain resolution. It did have both with x4 and x8 MSAA at 2560x1600, though. Since it did say the HD 7950 performs worse in BM:AC with tessellation on, min frames are expected to be lower, I would think. With that in mind though, compare how much average framerate was lost going from x4 to x8 MSAA. The HD 7950 worked at 86.2% and the GTX 660Ti at 74.3% of what they did on x4 MSAA. A bigger hit for the latter, seemingly attributable to its lower memory bandwidth.

Skyrim has 1080p and 1600p both at x8 MSAA shown in that article. Though there isn't a variation in AA level, resolution also stresses memory bandwidth, I believe (more pixels = more data). Again, it seems like you're comparing max and min framerates with that 50% figure of yours, which I don't really see how it would show how memory bandwidth affects a game. Let's again compare the two cards with the two settings, comparing the min, avg, and max framerates respectively between the two resolutions.
HD 7950: 69%; 73.2%; 74%
GTX 660Ti: 62%; 64.8%; 71%
The 660Ti takes bigger hits as you can see. See how memory bandwidth can affect performance scaling (significantly or not)? We all know how bottlenecks work.
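For anyone who wants to check these ratios themselves, they're just the heavier setting's framerate divided by the lighter setting's. A minimal sketch (the example framerates below are hypothetical, not HardOCP's actual numbers):

```python
def scaling_efficiency(fps_heavy: float, fps_light: float) -> float:
    """Fraction of the lighter setting's framerate retained at the heavier setting."""
    return fps_heavy / fps_light

# Hypothetical example: 50.0 avg FPS at 4x MSAA dropping to 43.1 at 8x MSAA
print(f"{scaling_efficiency(43.1, 50.0):.1%}")  # prints "86.2%"
```

Note this is a retained-performance ratio between two settings on the same card, not a min-to-max framerate ratio within one run, which is the confusion being pointed out above.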

About memory bandwidth having no bearing on the 660Ti in gaming, I'd have to disagree with that. Those points I shared above along with what luciferino said. Unless you're saying that the 660Ti had worse scaling for other reasons (like not having that much processing prowess with certain types of graphics workloads).


Even if their current financial situation is bad (dug into debt and stuff), this doesn't mean that they can't use the money they have right now to be more competitive and hopefully eventually climb out of debt, right? You could've argued with blaz about whether those billions are in fact enough, but just saying that they aren't doing well right now doesn't really prove that wrong. You'd have to get into the costs of things involved in being competitive to be able to argue here, and finances aren't the only factor; the quality of the people in AMD (e.g. innovative/smart engineers) matters too. As I remember, blaz was pointing out how they aren't being competitive (in the x86 market) enough, thus why they may be in the financial state they are now. He was saying they can do better with what they have right now, but they aren't (or hopefully just have yet to).

He also didn't say they were in a good position right now. He just said they have the capability to compete (and hopefully they could get to that good financial position)! I'd like to add that if heterogeneous computing kicks off and the value of AMD's shares skyrocket, a lot of those investors would be very happy people. High risk, yes, but in this business, how else can you expect such big yields?


Again, you're making wrong comparisons to what blaz claimed. This may be why you're so outraged. He was talking about efficiency. I'm really not sure how a ratio of min to max shows any kind of efficiency and even more so a memory bandwidth-related one.

Your quote may not do justice to the Cleeve-jaquith-blaz thing. For instance, it seems that Cleeve called jaquith a troll because he was being so unreasonable that it pointed to trolling. Something like this is stated in that same post from where you got that quote of Cleeve.

I remember reading through that scuffle during a time when it was still fresh, and, honestly, I don't think jaquith was totally wrong with whatever he/she was saying, though it might've been more of an attitude problem, not to mention forcing his/her own preferences.


Interesting info. TH is pretty much the only site I follow to this extent. I'm not sure what may have happened back then, but I hope TH is clean today, for this is where I put my faith, along with the forum people here, like ourselves. I also go on other sites occasionally to fill in information gaps since TH isn't perfect and that's quite alright.


Putting out a "magical chip" at a "magical" price to be able to make a profit and survive is the idea. Though, it may not be as magical as you may think. :)

Debt can be, but is not absolutely/always, dumb. Borrowing your capital to start a business isn't dumb. Getting a mortgage to be able to have a house isn't dumb. Having a payment plan to have a car or any other thing isn't dumb. What can be dumb is not planning responsibly on how to pay off that debt, and also forgetting that you actually have one, which may be shown by your further spending actions.

Like what you said about blaz concerning finance (which I don't agree with), you yourself may need a lesson in (macro)economics. I think a country borrowing money isn't necessarily dumb. Like with businesses, it depends on how you plan to use the money you borrowed. As I remember, there is actually a certain economic school of thought that encourages countries to borrow money. (I forgot if it was Keynesian economics.) The idea may have been that with that money, you'd improve your country (e.g. educate your people and thus have them be more productive), then you'd eventually generate more money with this investment, which you could use to eventually pay off the debt, pretty much like a business.


I am wondering about GCN's tessellation performance compared to Kepler's, though as I remember from a comparison article here on TH, the HD 7970 didn't necessarily trump the GTX 680 with tessellation i.e. GTX 680 still got more frames. It just didn't experience as much of a drop in performance when it was on compared to when it was off, i.e. efficiency (I hope you don't confuse this again with a min to max framerate ratio like you did with MSAA.).

It seems like we have different definitions of "real-world performance," because, as far as I'm concerned, "real-world performance" refers to performance data derived from any application actually used by people, games and productivity apps for example. Now, some people might actually derive fun from certain synthetic benchmarks, just like they do games (I did enjoy walking around in Heaven Benchmark. Don't judge me!!! :lol:). For those people, would anyone think I'm wrong when I say that the performance results for those synthetic benchmarks pretty much become "real-world performance" numbers for them in essence? After all, what is a game or entertainment application other than a tool for fun?

If you don't consider timed demos or repeatedly run scripts in specific scenarios in games (what I would call "real-world" benchmarks) to be able to indicate "real-world information," then what can? I'd have to contest you there since even though these benchmarks only show performance under a certain scenario in a game, they are data of that specific scenario nevertheless, and they do tend to indicate the overall performance of the rest of the game, depending on how these benchmarks are made. What else could we base our purchasing decisions on (one of the main, if not the main, reasons these benchmarks are made)? When you were talking about it in your post above, you gave benchmarks that were made to saturate memory bandwidth. That sounded like a synthetic benchmark.

I'm not sure what you consider a "real-world" test. Playing through a whole game at certain settings with a certain graphics card and seeing how it performed, for example? You could consider that to be one big timed demo in itself if you look at it. It can't be helped that people end up not doing the exact same thing a timed demo demonstrates, but having a sample is a good thing to have compared to nothing. For one thing, comparison between components would probably be inaccurate to a certain extent because someone not running a scripted demo would probably end up doing things differently each test run, thus likely resulting in different test figures.


I'm sorry, but you're either lying or mistaken with implying that you didn't attack blaz. A lot of things you said above were insults to him and his abilities. I wouldn't take this against you, but saying that you didn't is another story.

I don't think he would be done saying that stuff as they do seem to be credible. Again, take note of his statement. He's saying that the 7000 series cards take less of a hit compared to the 600 series cards when turning on/increasing (MS)AA due to memory bandwidth differences. Again, you've been getting percentages derived from the min and max framerates of certain game settings, when you should've been getting percentages from framerates of games when AA was on and off. With that in mind, I wouldn't say he was wrong for saying that the GTX 660Ti and less so, the GTX 670, have a weakness when it comes to (MS)AA. I would consider it a relative weakness that they aren't as efficient with MSAA as the competition, but that one relative weakness doesn't mean they perform worse at certain settings when it comes down to it. Maybe he could've been more specific as to say "MSAA" and you would be right to reprimand him for that if what he said didn't apply to SSAA, FXAA, etc.

If blaz prefers the HD 7950 compared to any cards Nvidia offers, he is entitled to his own preference. He didn't tell us not to get Nvidia cards. He said "I...would not want anything from the Geforce GTX 600 series for a higher end gaming system..." This thing he said may be more questionable "I really don't like playing at 1080p without some serious AA and Nvidia simply doesn't deliver in that," but again, he is entitled to his own opinion which may have been influenced by his choice of games and settings. If the 7000 series performs better for his needs, then maybe that's why "Nvidia simply doesn't deliver" for him. What "delivers" or not is subjective. Though again, that last statement of his is debatable and it's quite fine that you spoke up against it.


You seemed very offensive and also overconfident with your statements throughout this (and other) post(s) of yours. Remember that whenever anyone does what you did along with bragging, they just open up a way for them to eat their words (later on). That's why I try to be safe with what I say. I try not to be absolute, i.e. being open to other people's thoughts, thus acknowledging that I can be wrong, which is only normal for humans like us. Humility and tactfulness are virtues for a reason. :)

Sorry, but you got sidetracked a lot and made/implied false accusations. If you're gonna try to refute someone, you have to work against what they said and just focus on that. You've obviously put a lot of effort/work/energy in these posts, but no matter how much you do put into it, if you miss your target or aim in the wrong direction, it would be to no avail.

I'm not saying every single one of your points were wrong. A lot were factual. The ones that didn't seem so might've been mostly your claims against blaz. Anyway, they might've been factual and useful, but most of it may have been irrelevant to actually proving how the guy was wrong...

I hope you don't start hating me for this. I don't like "losing" people who have been nice to me. :)

(Whew! This took me a few hours to write as it seems. Well, that goes to show that I don't really have anything better to do, but I still think it's worth it as long as at least one person gets to read it. :))
 
blazorthon didn't say that Nvidia can't handle MSAA, only that AMD handles it far better, which is accurate. You're completely overreacting and in an overly offensive way.
http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/2
Straight from your own links, this shows that the 660 Ti does in fact handle MSAA worse than the 670. The performance difference increased with 4x MSAA compared to 2x MSAA in an otherwise identical test.

Not quite sure you even understand what this argument is about. I never said 660TI scales better than 670 or 680. They will perform in that order. I'm not even sure it's worth answering this as your whole premise is based on something that isn't really happening :) The 680 is faster than a 670, which is faster than 660TI...I'm confused.
HE said:
"Different settings can affect performance greatly on some card while not so much on others. For example, using heavy MSAA doesn't hurt AMD's Radeon 7000 performance much whereas Nvidia takes a nose-dive when it's used. "
blazorthon 09-16-2012 at 11:25:18 AM "
He's talking about ALL NV cards vs. the 7000's (all AMD cards) there. NV does not take a nosedive, as every game I showed clearly demonstrates. If it nosedived, I'd expect all victories for the 7950 Boost vs. the 660TI. You really didn't read my stuff or his, did you?

HE said:
"I'd also still be worried about the 660 Ti when you play with some good AA such as 4x MSAA and 8x MSAA. Even the 670 shows weakness in this, so I have no doubt that the 660 Ti would still do poorly in it. Even 1080p can show the 670 and especially the 660 Ti waning significantly in heavy AA while the Radeons with GCN GPUs just keep chugging along at about 85%-95% of the frame rates that they had without AA."
blazorthon 09-22-2012 at 02:26:48 PM
He's saying the 600 series performs worse than the 7000 series Radeons AGAIN.

HE said:
"I really like how the 7950 turns out in this and would not want anything from the Geforce GTX 600 series for a higher end gaming system, especially at 1080p. I really don't like playing at 1080p without some serious AA and Nvidia simply doesn't deliver in that. "
"Message edited by blazorthon on 09-22-2012 at 02:28:31 PM"
Seriously? That's the ENTIRE 600 series he has dogged vs. just the 7950 at 1080p (1920x1080). This is not a discussion about the 660TI not scaling as well as a 670. I thought we all knew that already. OOB the 660 will lose to the 670. The 670 will lose to the 680. But that isn't what this is about. It's about the 600's TANKING vs. the 7950 (or heck, even the rest). You don't even understand what is being talked about. You clearly didn't read my post. The bottom of my post to army_ant7 has all 3 of these quotes with times posted. I'm pretty sure army and Blaz both understand exactly what I'm pointing out, and there is nothing to argue about here. It is WRONG. You didn't even read my post to you...LOL.

I said this to you:"There is no game in their tests where the 7000 series is playable and the 600 series is not. Just click the links."
Which really is what this is about, as noted from his quotes. Read the conclusion page at hardocp as I requested you to do. It's page 8. It CLEARLY states that memory NEVER affected the 660TI despite the memory differences vs. the 7950 Boost. It also clearly states they never hit a "VRAM wall". Kyle even points it out.
KYLE said:
"We found that the 2GB GTX 660 Ti kept up with the GTX 670 quite well, even at high AA settings. Our averages in games varied, and the GTX 670 was always the faster of the two maintaining a performance lead. What didn't happen was that the GTX 660 Ti never hit a "VRAM wall;" even at extreme AA settings we were able to get through our run-throughs."
The 2GB 660TI NEVER HIT A WALL, even at EXTREME AA settings. Never mind the 3GB, as he points that out too. NO WALL, NO advantage for the 7950B vs. EITHER the 2GB or 3GB 660TI.

http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/6
Here's a kicker. What this really shows is the 660 Ti 2GB's memory bandwidth/capacity problem to a greater extent, and how a much more highly clocked GPU can make a difference. What this doesn't show is that the 7950 would win more often and more significantly in an OC versus OC comparison, or that the 3GB 660 Ti's have a distinct advantage over the 2GB models when more than about 1.5GB of memory is in use. As has been said by others, this is caused by the last .5GB (or more accurately, .5GiB) running at a third of the memory bandwidth of the first 1.5GiB of memory, a problem that the 3GiB models rectify.
Read my post to Army. Nothing changed with 7970 clocked at 1280 vs. 680, Kyle said in his superclocked comparison that it SWEPT the floor with AMD's cards even against the REF model.
Also read this article about 4GB at guru3d http://www.guru3d.com/articles_pages/palit_geforce_gtx_680_4gb_jetstream_review,18.html
Palit 2GB 680 vs. Palit 4GB 680...He tested this at the highest settings; the fun starts on page 18, with DX11 stuff specifically set up to show a 4GB beating a 2GB...LOL. Nothing happened. Despite Hilbert's efforts to find a reason to spend the extra $90, he failed.
From the article:
"I had hoped that the 4Gb version would show off a little, but perf remains the same at all tested resolutions."
Hilbert is WELL known for his excellent reviews, with in-depth analysis attempting to prove a specific situation, issue, or product quality. His motherboard reviews are quite awesome, getting right down to the types of capacitors, VRMs, etc., the chosen chips, and why they are good or bad...I love his reviews. Here's a sample:
"The Maximus V Extreme comes with 10k Black Metallic capacitors, offer a 5x longer lifespan with 10k hours at 105C, and 20% better low temperature endurance - specifically selected for extreme cooling scenarios. The overall choice on components is the best of the best really. But let's zoom in a little." He proceeds to "zoom in," so to speak... :) He usually gives FAR more detail than most would want. Which is, incidentally, my favorite part about reading his articles, even when I disagree. I love his details.

Another quote from Hilbert in conclusion page 26
"The 4GB -- Realistically there was not one game that we tested that could benefit from the two extra GB's of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixels resolution) there was just no measurable difference.
Now the setup could benefit from triple monitor setups at 5760x1080 (which is a 6 Mpixels resolution), but even there I doubt if 4 GB is really something you'd need to spend money on. It might make a difference at 16xAA and the most stringent games, or if you game in 3D Stereo and triple monitor gaming"

Nothing happened, and he can't even justify it on this page, even saying (and he's just guessing) it MAY make a difference with triple monitors in 3D (again, just guessing, because NOTHING he could run showed any difference up to a maxed-out 2560x1600). Also note the 4GB ALWAYS lost to the 2GB...ROFLMAO. IN EVERYTHING. Even though the argument isn't about this, I still proved you wrong. You can't even get your incorrect argument correct. Jeez. He is talking about maybe making a difference at a hypothetical 6MP resolution. Hardocp did an article at 5760x1200 too, showing the 680 wiped the floor with the 7970 even at that res, so I'm not sure it matters.

So, although blazorthon may or may not have over-exaggerated the impact of memory bandwidth, blazorthon was correct in that it can matter greatly. For example, the 670 has a 33% memory bandwidth and ROP count advantage over the 660 Ti, yet in the heavy AA tests, it generally had a 15-25% performance advantage in average FPS over the 660 Ti. For a 33% advantage, that's quite significant. The 7950w/B's comparatively poor showing despite more memory is not only attributable to the 660 Ti 3GB's faster GPU, but also that Nvidia gets somewhat more memory bandwidth with the same memory configuration, kinda like how modern consumer Intel platforms get more bandwidth out of the same memory configuration than AMD does with any of their consumer platforms. Basically, AMD has higher bandwidth, but not to such an extent as the numbers may lead you to believe. I'll try to find my links that show this, so please bear with me while I look.
Are you having trouble reading the parts that are PLAYABLE? At no point in 1920x1080 in the article you are quoting (my AA article, pages 2-4) does the 670 beat the 660 Ti by more than 16.2%. It's not 15-25%. You are exaggerating. The only time it gets above 20% is at UNPLAYABLE settings on BOTH cards. Is there any point in saying X beats Y when they both are running 15-17fps? Seriously? I don't care if you can show a difference at 5760x1200. Of course I can force 2 cards into a situation where I can finally show one beating the other. If the 660TI is running 1fps and the 670 is running 2fps, you've proven the 670 is 2x faster here. But IT ISN'T PLAYABLE, SO I DON'T CARE.
Max Payne, Batman: AC, Skyrim, Battlefield 3. NONE are more than 16.2% apart at playable settings. So don't quote 15-25% when it is USELESS to run where you can prove your case. Again, this thing isn't even about this, but you can't be bothered to figure that out, or even make a useful case with what you did say.

Regardless, HardOCP has already shown that even junk 7950s such as the XFX models can generally beat the best GTX 660 Ti models in average gaming performance unless you really hate high levels of AA (obviously excluding SSAA and since no amount of FXAA is a high amount of AA, also excluding FXAA).
Umm...Are we reading the same hardocp site? You just said a JUNK 7950 can beat the BEST 660TI? While Kyle keys on avg performance, he also shows the minimums. But I defy you to show me a JUNK 7950 beating the best 660TI (or heck, any OC'd 660TI) at hardocp. PAGE 8, KYLE'S CONCLUSION:
"Our performance results were not too far apart in most of our gaming between the Radeon HD 7950 w/Boost and GALAXY GTX 660 Ti GC at high AA settings. We often found both video cards with fewer than 10% performance difference. We saw a few instances where the Radeon HD 7950 w/Boost was faster, and we even saw a few instances where the GALAXY GTX 660 Ti GC was faster at higher settings. So much for the extreme memory bandwidth advantage on the Radeon HD 7950."

Another quote from the same page:
"Benchmarks and the like cannot tell you this real-world information, and can be extremely misleading. Take for example a test that stresses memory bandwidth and fills it to the brim, sure, the 7950 would win that test, but that test doesn't translate to what we just found out in real-world gaming. Performances were close between these cards, and in some cases faster on the card with the lesser bus and bandwidth. So focus on what real-world gaming tells you."

Both statements say pretty much half went to the 7950 Boost (by definition this isn't a JUNK card, it's AMD's BOOSTED edition) and half went to (Kyle's words) the "card with the lesser bus and bandwidth"...LOL. Umm, you're wrong. Again, not even sure why I'm having to discuss something that isn't even what this entire conversation between me, blaz, and Army is about, but you're wrong at every turn. I'm not even going to bother responding to you again. You are not even arguing about our conversation. You're having a conversation with some "other" me who I'm not aware of... :) Is my post pretty enough for you this time? It doesn't change a thing, but isn't it formatted pretty? :)

I digress...Please don't waste my time again. Good thing you didn't argue the "AMD can compete with Intel" claim that Blaz made...LOL. I will gladly BURY anyone who wants to tell me AMD can financially compete with Intel, as I have already done even vs. Nvidia. Inserting INTC into that commentary makes my statements ridiculously friendly to Blaz's misrepresentation of AMD's finances. INTC's profits AND sales AND market cap are 11x NVDA's. I still can't believe he said that. Are we done yet?

I was never offensive to anyone, unless you don't like data and long posts (I see you don't). If I can't say someone is WRONG and PROVE IT without being offensive, I'm not sure what to do or how to word it to make you happy. You attacked my formatting in the previous post (as usual when someone can't argue with the data) and put words in my mouth, as if I'd said he committed a crime by editing. I merely explained why it's difficult to quote someone when their posts are changing. This was in defense of me not coming up with what army_ant7 wanted...Though I found stuff that gave him exactly what he wanted anyway - precise quotes where he unequivocally declared the 600 series is bad and tanks vs. the 7000 series. I hope this is easy enough for you to follow and you don't start misquoting me again (or leaving out blaz's REAL claims, thus misquoting him too). If you find it offensive that I've proven YOU & blaz wrong, well, that's your opinion about ME and you're entitled to it, I guess. :) The devil's in the details, eh? Love the handle. I tried to get nobodyspecial but it was taken...LOL...Have a good night.
 

What you consider underclocked and overclocked may differ from what other people consider them to be, because last time I checked, any card sold with clocks higher than the reference clock is "factory overclocked," and manufacturers have to do things like bin chips well enough, possibly increase voltages, and provide warranty coverage for a card that may wear out faster.
Are you claiming that if one manufacturer puts out a card that comes OOB with a high clock, every other card on the market suddenly becomes underclocked? If you claim that the normal clock is the most common clock among cards found on the market, what if stock shifts and cards with a higher clock become the most common? Do the lower-clocked cards suddenly become underclocked?

Anyway, based on what you were saying, blaz was just suggesting that your analogy may have been better worded the other way around and with different figures. You said that you wanted to see the cars being run at 200+ mph, which is pretty much their max speed (i.e. limit) if you look at the two figures you brought up, 202 and 211 mph, and see if they would handle well, if the engine blows up, etc. If we were to put this in terms of cards, this analogy would imply that you want TH to overclock the cards (close) to their limit and observe their characteristics (like temperature, durability, and stability in general). That's why your analogy may have been false.

blaz gave more moderate figures like 65mph and 55mph, which don't need to be accurate since it's just an analogy. (The way you took these figures against him, it seems like you were the one playing with semantics, or something similar in nature.) What something is "intended" to run at is subjective. For one thing, I don't think a car is intended to run at its max speed, or anywhere close, all the time. I could imagine a part or multiple parts failing sooner than expected if that were the case, and that doesn't seem to jibe with the word "intended."

Anyway, if a graphics card that comes factory overclocked is intended to run at that clock setting, then that moderate number, 65mph, sounds like a reasonable comparison to me. The card could probably be overclocked more (including overvolting) up to a certain point, just like how the car can be run faster, but the manufacturer of the card "intended" for it to run at a certain speed, which is proven by the warranty they provide for that speed. I don't think cars have any specific speeds that they are "intended" to be run at, but given my point above, I doubt it's their max speed. The 55mph figure seems analogous to how TH underclocked the factory overclocked card.
blaz's analogy seems more sound than yours did, and even though we did get what you meant even without that analogy, him suggesting a better analogy wasn't just a matter of semantics I would say.

If you can't overclock a car, then why use cars in an analogy for this topic of ours? I would say you're the one who missed that point of this article. It's not about specific models (or what they're clocked at) but about the GTX 650 and 660 (at reference settings, of course, since they are called "reference" for a reason). What you should take note of is that overclocking is pretty much just increasing the max clock of a card (increasing how much it can be utilized). If a certain model of car can be bought with a V6 engine and you modify it by replacing that with a V8, then maybe you can compare that to overclocking. If the manufacturer already offers a V8 version, then you may consider that as factory overclocked.

I would say the speed of a car isn't analogous to a card's clock, but more so to its performance (framerates). You can drive/utilize a car under its max speed, and you can run/utilize a card below what its current clock offers.

Sorry, I just had to give my opinion on this... :)
 

You do seem confused about that first quote. He isn't saying that you said the GTX 660Ti scales better with MSAA than the GTX 670. In fact, he/she's showing the opposite, and also that the current AMD cards scale better than those. If you're confused as to what he's saying, you shouldn't say that about his premise.

A "nosedive" is a nosedive, but that doesn't mean the GTX 600 cards nosedived past the HD 7000 cards in performance. blaz didn't say that.
He said that he had no doubt the GTX 660Ti would do poorly with those settings. Where did he say it would do poorly though? Maybe he was just referring to that chart he shared?
Like I said in a previous post, whether or not the 600 series would "deliver" compared to the HD 7950 is dependent on what he considers to "deliver." He used "I" in both statements.
You're assuming too much.


Those things that "never" happened were in the scope of just his article. That doesn't mean they would never happen objectively.

Why talk about the HD 7970 and GTX 680 when luciferano was talking about the HD 7950 and the GTX 660Ti? Also, it's his source against yours. It just goes to show that there are different sources out there with different results, and it also shows how results can vary based on the different variables (e.g. settings) that come into play. How can you claim he/she's wrong like that? luciferano brought up an article, and if it shows that the GTX 660Ti with 4GB performed better, then there you go. Remember, whatever source you share only has data within the scope of that article, i.e. the specific tests they performed.

You keep on saying that people aren't reading what you're writing, but are you doing the same? So what if they are playable or not? That's beside the point about memory bandwidth having a substantial effect. If the person wants to show a range (15%-25%), he/she isn't wrong in doing so. He/she isn't exaggerating, but just being factual. Pointing out that the difference never exceeds 16.2% at playable settings is additional data and is welcome and appreciable. :)



I can't defend luciferano about that claim of a junk HD 7950 generally beating even the best of the GTX 660Ti's with high levels of AA. He/she had better come up with/specify an article that proves this claim. Other than that, I think I've addressed pretty much all of what you said in my comment before my previous comment. :)
Here's a tip, if you feel that someone is being offensive to you or seems like he/she's mocking you, call them out on it but don't stoop to their level. I mean, just now you said these:
Umm, you're wrong. Again, Not even sure why I'm having to discuss something that isn't even what this entire conversation between me, blaz, and Army is about, but you're wrong at every turn. I'm not even going to bother responding to you again. You are not even arguing about our conversation. You're having a conversation with some "other" me who I'm not aware of... :) Is my post pretty enough for you this time? It doesn't change a thing but isn't it formatted pretty? :)
Instead of firmly saying he/she was wrong, you could've asked what luciferano was referring to. I mean, if he/she does have an article to show, that would make luciferano right about his/her statements. And in case luciferano is right, he/she might take offense to you calling him/her "wrong."
You might also be making luciferano feel unworthy to join in the conversation which he/she should be free to do since this is an open forum and not a private group discussion. Again, that could be taken as offensive.
luciferano could be right about some things (which I have pointed out above), yet you tell him/her that he/she's "wrong at every turn." Again, another insult, especially if he/she is right about something.
Your last 3 sentences in that quote are just childish, in my opinion. Even though you may have taken offense at luciferano's comment about your formatting (like you said, it sometimes can't be helped), I don't think luciferano was being childish with that comment, and whether or not he/she was, you should know better than to respond like that. You should also avoid those kinds of remarks because they kind of make you seem like a troll.

Those are some of the things that you may be able to improve upon so as to come out as less offensive to others. I know that you may not (just may not) and don't have to care about how other people here feel, but being as tactful as we can can help us get our points across and that's really the main purpose of this forum. You aren't really giving crap points and I'd like to have you around the forums more, personally, but we could all use some improvement at times. :)
 
Lucky for you then that WinRAR recently got (better) multi-threading support. I'm not totally sure, but I remember that previously it either only used one thread or it wasn't threaded nicely, unlike the 7zip application, but that could only unRAR. This is irrelevant if you're using any other RAR encoder, of which WinRAR is the only one I know. :) Aside from that, plain core performance is also a big factor.
Yes, WinRAR uses it, and that 7zip limitation is partially why I don't use it. WinRAR is the de facto standard on the Usenet, so I use it, and I like the recent addition of multicore support (same with MultiPar, which is QuickPar with multicore). Some stuff on the net is 7z (more these days), but usually it's RAR. I have 7zip installed for those occasions too, though. :) Can't beat SourceForge stuff. :) Quite a few good apps there.

Are you implying that blaz is playing dirty here? And I mean changing the points of his posts rather than just clearing typos or making his points more understandable.
Whether he is or not, I don't know either way, but I found what you wanted anyway, or close enough to be exactly what was required, though you've already missed it twice. 😉


Like when you're claiming someone said something, but then he edits his post (which seems to be your problem with this). Facebook had the (good) idea of implementing an edit history available for posts that were edited.
Agreed, that would solve any issues. Unfortunately we don't have that here. (yet?)

BTW, if you're concerned about your post seeming like a wall, then you could use the quote system more granularly (i.e. quoting a post a segment at a time as you address it). Yes, it would make the post technically longer because of the way it shows up (the box the quote is in and stuff), but it would seem less like a wall and be easier on the reader's mind. That may just be me though...
In trying to keep it from taking hours to post (as you noted), I just took the lazy route and figured the data spoke for itself, no matter how ugly. :) I guess we posted at the same time... (I saw another email just come in; I'm guessing it was you, and who knows what you said at this point..LOL). You can see from my last reply to luciferano that I did this already; it was me being lazy, not ignorant. :) I'm in a bit of a hurry and shouldn't even be taking the time for this. :) I didn't even bother to check grammar/spelling (I saw an "I'm" that should be an "I", but who cares :)). I'm sure there are errors. I don't want to edit it after making comments about how many edits he made. :)


Again, when did blaz say that the 600 series tanks with (MS)AA? Worse of a hit with AA on doesn't necessarily mean it tanks.
See my post to luciferano...I put all 3 quotes near the end (the 3rd time I've done it, with timestamps on each, in each post) :)


"A 70% drop"? What are you referring to? The comparison of the max to min frames at a given setting? If you're using this against the efficiency percentages then you got it confused. What he was referring to was comparing frame rates with and without x8 MSAA, not with max and min frames.
I understand exactly what he said. I could have gone back and made it clearer (again, in a rush, and these are freaky huge posts to keep straight, man! LOL), but my point was what I said: dropping from 95 to 22 is "unacceptable and unplayable," which was really what I meant to say. Sorry if I didn't make that clear. As long as these posts are, I didn't correct a thing or re-read before posting. This one either. :) Jaquith said the same: if the drop is to unplayable, it isn't worth the hit (no matter what it is). Also, wild swings like this are hard on the eyes for the portion of people who notice them; it's kind of like watching a flickering show.

Unfortunately, that article didn't provide numbers for the HD 7950 and GTX 660Ti with and without AA at a given resolution. It did have both with x4 and x8 MSAA at 2560x1600, though. Since it did say the HD 7950 performs worse in BM:AC with tessellation on, min frames are expected to be lower, I would think. With that in mind, compare how much average framerate was lost going from x4 to x8 MSAA. The HD 7950 worked at 86.2%, and the GTX 660Ti at 74.3%, of what they did at x4 MSAA. A bigger hit for the latter, seemingly attributable to its lower memory bandwidth.

From Kyle: "Important FACT: Neither video card is playable at this setting, both performances are under 30 FPS for a lot of the run-through. "
This is my point. Why even discuss things that don't matter? In this game the 660TI scores a 16fps minimum at 8x, while the 7950B scores 14fps. As I stated before, the fact that the 660TI is faster here is POINTLESS. Correct? Dropping from 8x to 4x takes the 660TI to 17fps (not a tank, correct? It only changes by 1fps either way), and the 7950B also rises to 17fps. OK, so the 7950B took the larger hit, correct? It dropped 3fps, from 17 to 14, going UP from 4x to 8x. Unless my calculator isn't working, that's an 18% hit. Not 95% efficient. More like 82%, but that's beside the point I was really making. The NV card here is 94% efficient, negating what he said anyway. None of this means anything, as both are dropping with wild swings (both tied at 64fps max, and again tied at a 17fps min at 4x - LOL...total tie...I just realized that), and again unacceptable & unplayable, as I already said. The swing in fps is massive, and both are useless to quote, though again no loss for NV even if you could play here.
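To make the arithmetic above explicit, here's a quick sketch using the minimum-FPS figures quoted from the HardOCP run (17fps at 4x vs. 16fps at 8x for the 660 Ti; 17fps vs. 14fps for the 7950B). The helper name is just for illustration:

```python
def aa_scaling_efficiency(fps_4x: float, fps_8x: float) -> float:
    """Return the fraction of 4x-MSAA performance retained at 8x MSAA."""
    return fps_8x / fps_4x

# Minimum-FPS figures quoted above
gtx_660ti = aa_scaling_efficiency(17, 16)  # ~0.94 -> "94% efficient"
hd_7950b = aa_scaling_efficiency(17, 14)   # ~0.82 -> an ~18% hit

print(f"660 Ti: {gtx_660ti:.0%}, 7950B: {hd_7950b:.0%}")  # 660 Ti: 94%, 7950B: 82%
```

Either way, both cards are in the teens here, which is the real point: the efficiency percentages are moot at unplayable framerates.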

Dropping it down to 1920x1080 doesn't help either here:
My point? A 660TI min of 28fps, and a Radeon 7950B min of 22fps, in the test you cite, even when dropped down to 1920x1080. Do I really care if NV drops more when they win by 22% at a resolution/setting I consider unplayable on both? Don't forget the 7950B LOSES here. But that isn't the point. NV doesn't tank more than AMD when AMD loses anyway. That's semantics if the point is still that they lost, correct?
http://hardocp.com/image.html?image=MTM0NjA2MDA5M09tVGRTMlE0eDNfN18zX2wuZ2lm
Note that the 7950B is winning in max (95 vs. 80..LOL) but drops to "unacceptable and unplayable" (to quote myself again) at 22fps, LOSING to the NV 660TI at 28fps. I'd call that unplayable on both, as this is only a small snapshot; even with the length of their "real-world" test in actual gameplay, over 3hrs of fun I believe I'd be complaining EVEN about the 660TI, as it occasionally dipped BELOW even their 22%-faster 28fps. The 7950B here, however, is totally unplayable. Thus it's an instance some may argue is playable on the 660TI. This is the WRONG game for blaz to quote, or for you to defend; no matter how you slice it, unplayable at 22fps is just that: UNPLAYABLE. AMD seems to have a problem with large swings in fps from min to max, at times making it unplayable for a lot of people, and I'd say for ANYONE in this game vs. the 660TI here.

30fps has been the "magic number" for years. I'd argue 40, knowing how far some things dip, and some are now saying 60fps for the same reason. Guru3d goes into this in his review. He considers 30-40 playable, and says at 40-50 you should never see a slowdown (he does say to go above it). Though I think I've proven that not entirely correct. A 95-to-22fps drop? He'd better rethink that. :) But it's his opinion 😉 And mostly correct; people have to have some guidelines in a review. He went further with above 50, 100, etc. So still good stuff taken as a whole. If my min drops below 60, I buy more card or adjust. But then, I don't like getting killed because I couldn't react fast enough in multiplayer with craploads of stuff on the screen. Though I haven't played online for a while, that's where I formed my opinion on this. Professional gamers (god help you if you get in a room with them dudes) have a MUCH higher number in their heads than even 60fps.

From Kyle's review:
"Important FACT: Both video cards are playable at this setting, but the HD 7950 does lag a bit during tessellated scenes."
You said:
As I remember, the supposed tessellation advantage of the GCN cards over the Kepler cards
I know of no tessellation advantage of GCN over Kepler. I think it's the other way around, as Kyle points out in his 680 & 660 reviews as noted. GCN has a GPGPU advantage. NV chose to key on games, which is why the 680 wipes the floor with the 7970 even when the latter is overclocked to 1280, as I quoted previously from Kyle's superclocking review of the 680 (Lightning card, if memory serves). He had the ref model in there too, which was also quoted as beating the heavily OC'd 7970, enough for Kyle to say they both wiped the floor with it. They cut out GPGPU since most don't Bitcoin or Fold@home (for apps, CUDA fixes a lot of the disparity here anyway - PS plugins as an example). I'm not sure you can make much today with the former, as the easy blocks are gone, and I don't even understand why people would Fold@home, running up their electric bill (and in my case, the heat in the room :)). I buy a gaming card to, well...GAME. I'd say quoting anything else is niche market, given the number of people doing either. Which is exactly why I'm glad NV gave us a gaming card without the junk (IMHO, as a gamer). No point in heating up, wasting electricity, or wasting die space when 95% of us game on these cards and not much else. Your earlier statement about people buying a card just for a specific game makes this point precisely. Wasted die space costs more money and causes more heat (1.25v for the Boost vs. .9xx for 660TI's).
http://www.anandtech.com/show/6152/amd-announces-new-radeon-hd-7950-with-boost
The effect of that voltage is shown and commented on in detail by Ryan Smith. Despite his asinine argument with me over 2560x1600 being popular for 660TI's, and over Korean monitors bought off eBay or from some guy on Amazon with no phone # and a Gmail account for returns (LOL), he usually makes good reviews. My only guess is he loves AMD or is paid. But their data has been good for a long time. Ryan's conclusions in his 660TI post were wrong. His data wasn't, which is why I used it from 3 of his own articles against him. I quoted some of those benchmarks here previously to blaz. At 1920x1200 in their 660TI review, the 7950 only won ONE game, by 4%, with 6 losses, and 4 of the 6 were over-20% losses. OUCH. Yet he went on to cite bandwidth as a reason (citing 2560x1600 benchmarks) to promote the 7950 and said, "If we had to pick something, on a pure performance-per-dollar basis the 7950 looks good both now and in the future". But it lost 6 out of 7 benchmarks in their own tests...LOL.
http://www.anandtech.com/show/6159/the-geforce-gtx-660-ti-review/21

Now I am getting sidetracked. :) But just to be clear on tessellation, how about Cleeve? Since you believe Tom's and don't read elsewhere much. :)
http://www.tomshardware.com/forum/page-3279_56_100.html
"First off, I don't use maximum tessellation unless it provides a visual benefit, and doesn't cause too much of a frame rate hit. In the case of the 660 ti and the games we tested, I found it fails on both counts. Nvidia and other sites crank up tessellation, and I believe this gives the GeForce a sizable advantage in a number of tests."
from Cleeve's 08-17-2012 at 11:08:56 AM post. Nuff said? It's right there in the comments section - page 3 if viewing in the forums, or page 7 if from the review, I think. :) Well, it's in that link's comments, not directly here.

Skyrim has 1080p and 1600p, both at x8 MSAA, shown in that article. Though there isn't a variation in AA level, resolution also stresses memory bandwidth, I believe (more pixels = more data). Again, it seems like you're comparing max and min framerates with that 50% figure of yours, which I don't really see how it would show how memory bandwidth affects a game. Let's again compare the two cards at the two settings, looking at the min, avg, and max framerates, respectively, between the two resolutions:
HD 7950: 69%; 73.2%; 74%
GTX 660Ti: 62%; 64.8%; 71%
The 660Ti takes bigger hits as you can see. See how memory bandwidth can affect performance scaling (significantly or not)? We all know how bottlenecks work.
But not in everything, which is the point against his comments (that "nvda can't deliver" etc... those are NEVER-type comments). It depends on the game, as we've both already noted in various instances, and more to my point (and jaquith's, I think): if it can't be played there, who cares? This is the only game playable at 2560x1600 (which I noted in a previous post). I totally understand bottlenecks, and HardOCP showed there is none on the 660 Ti when compared to the 7950B at anything playable. You can force it into one, but it's not easy to do at anything playable on both cards.
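For what it's worth, the scaling percentages quoted above are just the ratio of 1600p performance to 1080p performance, per metric. A minimal sketch of that arithmetic (the inputs below are illustrative, treating 1080p as a 100% baseline; the real framerates are in the Anandtech review):

```python
# Performance retained when moving from 1920x1200 to 2560x1600,
# per metric (min, avg, max). Inputs are illustrative values only.
def retention(fps_low_res, fps_high_res):
    """Percent of low-res performance kept at the higher resolution."""
    return [round(100.0 * hi / lo, 1) for lo, hi in zip(fps_low_res, fps_high_res)]

# Treating 1080p as the 100% baseline for both cards:
print(retention([100, 100, 100], [69, 73.2, 74]))   # HD 7950-style scaling
print(retention([100, 100, 100], [62, 64.8, 71]))   # GTX 660 Ti-style scaling
```

The bigger the drop in these retained percentages, the harder that card is hitting some bottleneck at the higher resolution, whether that bottleneck is bandwidth or something else.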

About memory bandwidth having no bearing on the 660Ti in gaming, I'd have to disagree with that. Those points I shared above along with what luciferino said. Unless you're saying that the 660Ti had worse scaling for other reasons (like not having that much processing prowess with certain types of graphics workloads).
Links, please, to anywhere the 660 Ti has bandwidth issues at settings I can run and play above 30fps, besides the game already noted. Though I'd argue anything in the 30s will dip below that during hours of play, and I prefer a higher number; but I can't complain about anything you can point to that says 30+, as that is the well-known "magic fps number" (though I think when they said that, they meant something that never drops below it). Taking your points, I'd like to see something rather than a comment. :)

I didn't say memory has no bearing on 660 Ti gaming. As noted above, I can run a card at all kinds of ridiculous settings and probably make even a 690 seem slow, but it would likely be doing something not normally used just to prove the point. My point is that anywhere bandwidth is an issue for the 660 Ti, it is also an issue for the 7950B, and if one has the issue, it's likely the other isn't playable either, as HardOCP showed. I quoted him in the response to luciferano if memory serves, but here it is again. He said much more; just read page 8 for HardOCP's conclusions.

Kyle said:
"Take for example a test that stresses memory bandwidth and fills it to the brim, sure, the 7950 would win that test, but that test doesn't translate to what we just found out in real-world gaming. Performances were close between these cards, and in some cases faster on the card with the lesser bus and bandwidth. So focus on what real-world gaming tells you."
HardOCP says it someplace in their review: if a game didn't play well on one card, it didn't on the other, and the same is true in reverse.


http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/7
The Radeon HD 7950 has a 384-bit memory bus with 240GB/sec of memory bandwidth, while the GALAXY GTX 660 Ti GC has a 192-bit bus with 144GB/sec of memory bandwidth, yet we sometimes see the GALAXY GTX 660 Ti GC video card performing faster, but the 660 Ti does have a GPU clock advantage. We don't see that 66% advantage in memory bandwidth the Radeon HD 7950 has in real-world gaming with high setting AA configurations. This means there are other factors besides the width of the memory bus and the bandwidth that affect performance between these two cards; namely GPU clock and architecture advantages.
Read his conclusions on page 8 and come back to me on that again if you still desire it :)
http://hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/8
When we look at the specifications between both video cards we find that the Radeon HD 7950 has a 66% memory bandwidth advantage. The Radeon HD 7950 has a 384-bit memory bus with 240GB/sec of memory bandwidth. The GALAXY GTX 660 Ti GC 3GB has a 192-bit memory bus with 144GB/sec of memory bandwidth. The Radeon HD 7950 has 66% more memory bandwidth than the GALAXY GTX 660 Ti. The AA performance results we tested though didn't show this, and in fact reinforce how memory bandwidth and bus width is not everything in gaming.

Our performance results were not too far apart in most of our gaming between the Radeon HD 7950 w/Boost and GALAXY GTX 660 Ti GC at high AA settings. We often found both video cards with fewer than 10% performance difference. We saw a few instances where the Radeon HD 7950 w/Boost was faster, and we even saw a few instances where the GALAXY GTX 660 Ti GC was faster at higher settings. So much for the extreme memory bandwidth advantage on the Radeon HD 7950.

So what does it all mean? It means that there is more to performance than just the width of the memory bus and the bandwidth available to the memory. There are other factors involved that determine gaming performance, and the widest memory bus with the most bandwidth isn't always going to be the winner in actual gaming performance.
This pretty much says it all, but you can read his whole page. There is more in there. I'm omitting the entire "THE BOTTOM LINE" section. But here's the best part of that:
"Benchmarks and the like cannot tell you this real-world information, and can be extremely misleading. Take for example a test that stresses memory bandwidth and fills it to the brim, sure, the 7950 would win that test, but that test doesn't translate to what we just found out in real-world gaming. Performances were close between these cards, and in some cases faster on the card with the lesser bus and bandwidth. So focus on what real-world gaming tells you."
Yes, you can force it into a situation where bandwidth becomes an issue, but in MOST cases I'd say you're going to be looking at a resolution and fps that no person in their right mind would run at (the 14/15/17 fps noted above... heck, the 22fps too... you like dropped frames and getting killed in a shooter?).
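And just so the much-repeated 66% figure itself is clear, it's nothing more than the ratio of the two published memory-bandwidth specs:

```python
# The quoted "66% more bandwidth" is just the spec-sheet ratio
# between the two cards; it says nothing about real-world gaming.
hd7950_bw = 240.0    # GB/s (384-bit bus)
gtx660ti_bw = 144.0  # GB/s (192-bit bus)
advantage_pct = (hd7950_bw / gtx660ti_bw - 1.0) * 100.0
print(f"HD 7950 bandwidth advantage: {advantage_pct:.1f}%")  # 66.7%
```

Which is exactly HardOCP's point: a 66.7% spec advantage that doesn't show up as anything close to a 66.7% gaming advantage tells you bandwidth isn't the bottleneck at playable settings.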

Even if their current financial situation is bad (dug into debt and stuff) this doesn't mean that they can't use the money they have right now to be more competitive and hopefully eventually climb out of debt right?
They are burning through $600 million per year and have lost $6 billion in the last 10 years. They have grown weaker EVERY year, until giving up their fabs and now the x86 race with Intel. I'm not quite sure we're looking at the same company. At this rate they will have no cash by Christmas 2014: 600+600+300 = $1.5B; 1yr + 1yr + 0.5yr = ~Christmas 2014 = NO CASH. Taken yearly, they are currently losing money on everything they sell and have been for a while. I'm not saying recovery is impossible, but stating absolutely that they are competitive now is... well, do the math. If I say what I think, you'll tell me I'm outraged again. :)
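The runway math above, spelled out (using my own round numbers from this thread, not audited financials):

```python
# Cash-runway sketch behind the "no cash by Christmas 2014" claim.
# Figures are the round numbers used above, not audited financials.
cash_on_hand = 1.5e9     # ~$1.5B: the 600M + 600M + 300M from the text
burn_per_year = 600e6    # ~$600M burned per year
runway_years = cash_on_hand / burn_per_year
print(f"Runway: {runway_years} years")  # 2.5 years: 1yr + 1yr + 0.5yr
```

Obviously a linear-burn model is crude (burn rates change, assets can be sold), but that's the shape of the argument.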

You could've argued with blaz about if those billions are in fact enough, but just saying that they aren't doing well right now doesn't really prove that wrong. You'd have to get into the costs of things involved in being competitive to be able to argue here, but finances aren't the only factor but also the quality of the people in AMD (e.g. innovative/smart engineers). As I remember, blaz was pointing out how they aren't being competitive (in the x86 market) enough thus, why they may be in the financial state they are now. He was saying they can do better with what they have right now, but they aren't (or hopefully just have yet to).
There is no "may be in"; they are in danger of going out of business soon. I mentioned Jerry Sanders (old CEO, possibly their top engineer in his time; is he before your time or something? Jerry Sanders was an AMD GOD! He cofounded the company, meaning he knows CPUs), Dirk Meyer (do you not know who this guy is? DEC Alpha/Athlon/Thunderbird etc., another AMD GOD), John Bruno (Llano architect), Bob Drebin & Raja Koduri... all top-notch people, all making the things that made them $$$.
https://en.wikipedia.org/wiki/Jerry_Sanders_%28businessman%29
This guy's life story is quite awesome. A truly nice fellow and it is a shame we have lost this guy in cpus. This dude had style & class in spades too :)
Sanders always wanted to make money, but he realized that the key to earning wealth was for everyone else at AMD to make a lot of money too. While growing wealthy, he also lavished wealth generously on all his employees. At the end of the company's first $1 million quarter, Sanders stood by the door of the company and handed a $100 bill to every employee as they left. Every employee at the company got stock options, a huge innovation at the time.
Some rich people are GIANTS among men. Our govt has no business spreading the wealth when we have great men and women like this in our country. People attack Romney on every other news channel (dumb, biased fools) over his taxes at 14%. They've been talking about this for weeks, but we have a dead ambassador and two dead SEALs! And they leave out that the man gave 30% of his money to charity! He also gave his ENTIRE inheritance from his father to BYU. Effectively, Romney paid 44% in taxes on his own. He set up a trust for his 5 kids worth $100 million. He has donated $13.6M to the Tyler Charitable Foundation in the last 13 years. Oops... don't even get me started on our new welfare nation. :) I truly mourned the day Jerry Sanders left AMD and that jerk from Motorola took over. Jerry left Fairchild because "Hogan's Heroes" from Motorola moved in...ROFL. Oh, the irony. That's how AMD started, if memory serves.
http://news.cnet.com/8301-13579_3-57476345-37/another-amd-engineer-goes-to-apple/
"At Apple, he joins AMD alumni Raja Koduri and Bob Drebin, both formerly AMD CTOs."
These are their Chief Technology Officers. Heck, people are jumping from the sinking ship like rats (again, BAD, this sucks). It's also hard to get good employees when they're afraid you probably won't have a job for them for more than a few years, if that. Witness this UGLY news:
http://www.theinquirer.net/inquirer/news/2104764/amd-finally-hires-ceo
Tim Cook, Mark Hurd, Pat Gelsinger (this would have been a #1 replacement), and Greg Summer ALL TURNED THE JOB DOWN. This is serious; I don't know how else to put it. Rory Read?... no comment. :)

I said " I don't care much about ceo's really, but losing people like Dirk Meyer (DEC/Athlon designer) and Bruno (lano etc) is a very bad thing. You lose your brains you're stuck with marketing and AMD has always sucked at that and has no money to fight an ad war anyway. Anyone believing AMD isn't in trouble hasn't read their financials."
I don't think you understand that I look FAR deeper into the company than just the balance sheet. I invest, and as such have to know far more than someone commenting on a hardware forum. I STUDY THE PEOPLE, with a major emphasis on the MONEY and how they spend it. You're making generalizations and I'm being VERY SPECIFIC in my criticisms of their company. That isn't bragging either; it just means I've looked at it more than you're alluding to, and I already covered the "BRAIN DRAIN" at AMD. Or, more accurately, I'm afraid of losing my money...ROFL. If you look at the news on AMD, even half of their top brass has been leaving left and right over the last 5 years. AMD used to have the best engineers, and Intel's brute force from fabs kept Intel always ahead, but with AMD close. Now they are replacing their #1s with #2s or worse, since a lot more companies are fighting for the great people (QCOM, ARM, NVDA, etc.). AMD used to be within 10% of Intel on most benchmarks and win some too. Nuff said. 😉

http://www.brightsideofnews.com/news/2012/9/17/thomas-seifert-leaves-amd2c-destination-siemens.aspx
Chief Financial Officer too! Seifert, gone. How many people should I list before you take my point seriously? I looked long before I spoke (mostly because I own NVDA; know your enemy, so to speak :)). All but Sanders and Meyer left this year, I think; Seifert just last month. These people ran their projects in all areas of the company. The best news is that AMD has been paying down the debt.

But if you know Jerry Sanders, his theme for years was "REAL MEN HAVE FABS". He must be cringing now that Hector Ruiz killed them. They had to sell their fab to pay down the ATI purchase, which was a completely moronic use of money. Hence my comment regarding mismanagement of money and losing BRAINS. Note, I don't think real men have to have fabs...LOL. Jerry's reasons were sound back then, but today I think going fabless and letting the pros who do nothing but fabs handle it for you is the way (not talking about Intel here; they already had fabs and can fund and fill them).

They also MAY get a decent amount of money from the Wii U... but that's your "if" stuff, and I don't bet on that when I think consoles are dead this next round. Next Christmas or so, phones/tablets will surpass the Xbox 360/PS3, and be passed by the new consoles a year later, IMHO :) (I put the IMHO in there for you, army_ant7; I'm more than confident enough to say it without it, though...LOL). I also think they had to make one heck of a deal to keep NVDA out. Pricing yourself to death doesn't work. Jen won't do that. He didn't fight to get into Amazon's kindle2 for this reason: let someone else take it on the chin; he'll go with someone who actually gives more for the deal (Google/Microsoft etc.).

He also didn't say they were in a good position right now. He just said they have the capability to compete (and hopefully they could get to that good financial position)! I'd like to add that if heterogeneous computing kicks off and the value of AMD's shares skyrocket, a lot of those investors would be very happy people. High risk, yes, but in this business, how else can you expect such big yields?
I'm not a fan of people who use "if" statements to argue against what we KNOW to be fact now. They are in trouble, as even he noted. My beef is that he said they are FINANCIALLY competitive with INTEL (heck, with anyone) when all the data around you says NO, they are not. They are only in business because Intel doesn't want an FTC suit on their hands. Intel could lower prices and put them out of business in 6 months, could do this for years, and still not exhaust their $14B in the bank (and that's not all; there's a lot more in investments).

They will not need to "artificially" keep AMD in business as soon as NVDA/QCOM/ARM-based chips from everyone enter the server/desktop/notebook CPU areas. Once INTC has competition from all around, they can claim AMD is not their only competitor and finally drive the nail home, as they will still have OTHER competitors in the market. Project Denver, Project Boulder, and beyond. Not good. All of them are moving in to compete with Intel & AMD.

Not too long ago, Otellini said he would drive AMD into the ground. I'm fairly certain this was in regard to the next few years, when he can't be caught for FTC violations. It just makes sense, and they've had a pissing contest going since AMD's inception. Jerry used to go round and round with these guys... :) I miss his flair. ARM (and all the people about to enter CPU land) is bringing INTC the ability to put AMD out of business without fear of FTC repercussions.

You act as though AMD is the only one with heterogeneous computing ambitions. I'm not quite sure why this is important unless you have some proof they are going to be in a DOMINANT position with it. Is this the case?


Again, you're making wrong comparisons to what blaz claimed. This may be why you're so outraged. He was talking about efficiency. I'm really not sure how a ratio of min to max shows any kind of efficiency and even more so a memory bandwidth-related one.
Already discussed this, and I'm not outraged. Discussing something with someone, or even pointing out where they're wrong, doesn't make me outraged. I wish we could just ban words like "raging" and "outraged" when someone lays out a well-thought-out opinion that happens to differ from your view. If you could check my blood pressure, I assure you it's fine. :) Can you see me or something? Am I swearing at you or anyone else? Having a vision of me in front of my keyboard pulling my hair out? ... :)

Your quote may not do justice for the Cleeve-jaquith-blaz thing. For instance, it seems that Cleeve called jaquith a troll because he was being unreasonable so much that it pointed to trolling. Something like this is stated in that same post from where you got that quote of Cleeve.

I remember reading through that scuffle during a time when it was still fresh, and, honestly, I don't think jaquith was totally wrong with whatever he/she was saying, though it might've been more of an attitude problem, not to mention forcing his/her own preferences.
I believe I said he made some points, though not too eloquently. I also noted I'm not judging him for another 10 articles (just a framing reference, nothing serious). The point being, I'll make my own judgment after reading more of his stuff in the future. I clearly said a few things are just suspicious, and pointed out why, regarding the benchmarks, jaquith's posts, and the old Tom Pabst info. As said, that whole affair was AWFUL, and it is difficult to see benchmarks missing what I feel would be a more well-rounded view of the subject with other AA modes in there. But he has every right to run what he wants. I made my case for not running reference and quit. I was not trying to force it, just wondered why, until blaz etc. took it further (you :)). It would have died on the vine without us discussing it. I'm pretty sure I said nothing nasty in those posts either.

I'm not sure why people are so THIN-SKINNED. Have we reduced our country to apologizing for everything? When Muslim radicals kill our ambassador and two SEALs, we apologize? Like him or not, Bush would have been dropping bombs. With Romney's connection to BB Netanyahu, I believe he would do the same. You can't seem to make a comment these days, or have any kind of debate (heated or not), without making people hate you if you're not warm and fuzzy. I love a good debate, but I'm seriously growing tired of having to dance around people's emotions these days.

You can call me a dumb prick who knows nothing. The best you'll likely get back is "well, at least you didn't attack my data." :) Personal attacks don't mean much to me. I didn't even stoop to Jarred Walton's garbage (an Anandtech reviewer) when he responded to me with "ahole" and said I had an uninformed opinion after I tore down all of Ryan's arguments...ROFL. The best he got was "is Anand reading this?" and "WOW," I think...LOL. I mentioned the data not being rebutted too, of course. 😉 But that was really already done by another user in a WALL of his own...LOL. That guy saved me hours...ROFLMAO. Ryan made it easy; it was not a debate for me to really brag about (saving you some typing...heh). You are MUCH more fun. :) Aww, c'mon, that made you laugh, don't lie... :)


Interesting info. TH is pretty much the only site I follow to this extent. I'm not sure what may have happened back then, but I hope TH is clean today, for this is where I put my faith, along with the forum people here, like ourselves. I also go on other sites occasionally to fill in information gaps since TH isn't perfect and that's quite alright.
I made no comment on the new regime here. But you should ALWAYS go to other sites and read them all with the same fervor you give Tom's. :) In fact, I said:
I'll read another 10 of Don's articles before I make a new judgement and stay or go again. :).
Nothing unfair, based on Tom's Hardware's previous issues, and at least I came back to make that call. :) "Fervor" might be the wrong word there, but you know what I mean. You like Tom's; I say like 15 sites :) with 5 more to fill in the details...LOL. But a large majority of my reading is about MONEY, so that's not a comment on your habits, more an excuse for mine. Though I do advocate cramming in as much as your head can hold. :)

Putting out a "magical chip" at a "magical" price to be able to make a profit and survive is the idea. Though, it may not be as magical as you may think. :)
Wishful thinking, unless you know more about their current pipeline than I do. Can you expound on this? Piledriver (way late, BTW), Steamroller, Excavator, etc... what do you mean? Did they recently invest $4B in 14nm that I don't know about, which will put out this magical chip ahead of Intel's TICK-TOCK Haswell release? Do they have some leading cell-phone chip coming I don't know about? They are pricing themselves to death. Tell me something that proves otherwise. You can look at the 10-year financials I ran down to understand where I'm coming from. I'm not sure why I'm having a conversation about statements rather than data, but maybe you really do know something I'd REALLY want to know.

Debt can be, but is not absolutely/always dumb. Borrowing your capital to start a business isn't dumb. Getting a mortgage to be able to have a house isn't dumb. Having a payment plan to have a car or any other thing isn't dumb. What can be dumb is not planning responsibly on how to pay that off that debt and also forgetting that you actually have one which may be shown by your further spending actions.
Another "what if" statement. I made my claims regarding AMD and their handling of their finances vs. NVDA (sparing him the Intel rundown; Intel makes 2x in profits what AMD has in revenues... read that again, please, and realize the favor I did him by not attacking his ACTUAL statement; I was being nice). I made special note of Jen-Hsun's spending and not spending foolishly. Buying a home you can't afford is foolish, though. Buying a business you can't afford (ATI) is foolish. Buying a car with a payment beyond your means is foolish too. I make ~$50K and have never had a car payment (new, 3 of them) over $129. I paid my last one off in 3 months. I do not believe in giving someone else interest if you can just wait another year and avoid the whole thing. :) That's not bragging either. :) I just chose to save and then buy, rather than giving away interest. I'm baffled by people who don't own a home but have a pair of $350 car payments and live in an apartment. A nod to your buying-a-home comment: buy a home, screw the nicer car.

My quote:
Jen doesn't seem that dumb. He's highly intelligent and runs his company as if his 2/3 wealth depends on it. He has only made smart purchases well within their reach and not overextending the companies finances. Smart man. DEBT=DUMB.
You are taking me out of context. I was specifically talking about AMD in the previous sentence, then about NVDA's CEO spending wisely and not overextending his company (like AMD did, paying 3x what they should have for ATI) in this sentence. I went further and said I hoped he didn't make the same mistake as AMD. These are very specific statements and very accurate as to the situations both are in now. Jen usually waits until you are worth nothing to pay for you, like 3DFX, unless there is ample reason to pay up, like Icera. Even then it was a mild purchase at ~$324 million with $3.5B in the bank. Good move; it bought RF/modem/LTE experience, etc. Nice going. He didn't even jump at cheap AMD not long ago. I think that was a mistake, but whatever; it's his money, and not buying something isn't as dumb as overpaying for it. He knows his business better than I do; he's the millionaire. :) Nuff said. :) I don't feel like degrading myself with another sentence after that...LOL.

AMD is essentially doing what our govt. does when printing money it does NOT have, diluting our dollar. AMD's company is not worth as much as it owes. A bank that loans you money at 0% down is DUMB (again, specific). This is how we got into this housing mess. If they had actually loaned money to people with 20% down, we wouldn't be in this mess, but they wanted people who hadn't EARNED a home to have one (ah, the welfare nation at work). Banks won't give you a loan on a new business with no collateral to back it up; I can't go in and get a loan for a business venture without something worthy behind it. When your credit rating (AMD = BB+ last I checked) is going down the tubes, borrowing is EXPENSIVE compared to someone with AAA.
https://en.wikipedia.org/wiki/Bond_credit_rating
Take a good look at how far down that list BB+ is. You think this makes things easier? The USA has been downgraded for the 2nd time during Obama's term, and this is the 1st time it has happened since we earned our AAA rating in 1917; that is, the first downgrade since getting triple-A, ever. OUCH. Read that again and tell me this is good news. :)


Like what you said about blaz concerning finance (which I don't agree with), you yourself may need a lesson in (macro)economics.
Wow, really? Here we go with condescending statements with nothing to back them up. I laid out what I said very accurately regarding AMD's position vs. NV (again, being kind and leaving out the much worse Intel comparison). What part of their finances would you like to debate? Did you read any of it, or just make statements because I said your friend should read a balance sheet before commenting on the financial matters of a company?

I didn't call him stupid. But I will say my intention was to claim his financial ignorance regarding a company's finances and balance sheets. Ignorant is not stupid. Stupid implies you can't learn; I implied he was ignorant. I believe if he sat down and started to read about finances, he has the ability to one day pick a financial fight with me and make me feel like a fool (I don't know him; could be). Blaz makes mostly reasonable statements and doesn't sound like a fool. I didn't call his opinions stupid, just said they are not financially sound.

If I start talking cars and "giving lessons" about them, I fully expect someone to correct me beyond belief, because I admit I am COMPLETELY IGNORANT ABOUT CARS. Get it? I wouldn't be the least bit defensive about someone pointing it out. I'd probably laugh and join them in admitting it. It is true. I wouldn't say the person was "outraged" for pointing out my ignorance when I have no defense regarding cars.

If he knows how to read a balance sheet, then I am confused as to why he said that. Come forth, blaz, and let's talk finance regarding AMD vs. Intel. Heck, we can even bring in your knowledge of CPUs/APUs/video cards and discuss why you think they are even in this competition. I submit they are not financially or performance competitive with INTEL. With Intel investing $4B in R&D just in 14nm (that's 2.5x what AMD has in cash) and making $12+ billion just this year, I'm interested in his thoughts. Intel makes enough money to spend $5 billion on Itanium, have it fail, and be unaffected.
They weren't overreaching when they did it, so the waste/bet means nothing. Microsoft doesn't care about losing money on Xboxes as long as it puts their competition out of business before Microsoft bleeds out itself. :) Netscape comes to mind. A ruthless tactic, but effective: give it away until your enemy dies.

If you would like to go into a financial forum and have this debate over AMD's finances, let me know when and I'll join you. I'll even wait for them to "discuss it with" you and say nothing. I'd like you to tell them you'd like to invest in AMD because they have the finances to compete with Intel. Just paste the line he said into a financial forum and watch what you get back. I imagine you will be called ignorant, no matter how they word it. Actually, it may not be that nice. 😉 I'd get a good laugh, I think, realizing you didn't say it of course, but semi-defended his stance.

You said this previously:
Sorry for seeming to be on the side of team blaz, but I just see somethings (a lot of it personal) here that needed corrections or clarifications. Sometimes, all of us could do the first sentence (and just that one) of what hasten said in his last post. Emotions can run wild and cause us to be well...irrational with our actions.
You got that first part right, but no need to apologize for it; I already see that you are on the "team". :) No need to hide it. You've been pecking at me politely (being polite about it changes nothing), but over just about nothing, IMHO. What personal stuff needs correcting, given what I've said? How should I word it when someone makes a statement about a company nearing bankruptcy (did you see the link I gave: a 49% chance, vs. 1% for NV, 2% for INTC, 1% for QCOM, etc.?) and acts like they are financially sound? Did you read all the financial stuff, or just come at me to comment on how I offended him? :) Attacking my wording (or the length/formatting of my posts, as with luciferano) does not change the data either. As I told him, he can post a wall to me and I'll read it. If he's incorrect (assuming I have time...LOL), I would post as many words as I think prove my point, with data to back it up. If we both have great points, well, the debate is a draw, I guess. :)


I think a country borrowing money isn't necessarily dumb. Like with businesses, it depends on how you plan to use this money you borrowed. As I remember, there is actually a certain economic school of thought that encourages countries to borrow money. (I forgot if it was Keynesian economics.) The idea may have been that with that money, you'd improve your country (e.g. educate your people and thus have them being more productive) then you'd eventually generate more money with this investment which you could use to eventually pay off the debt, pretty much like a business.
Just discussed this, and not really in my context, but... you went there, so I'll expand on it: when you are borrowing money you don't have (robbing Peter to pay Paul, like taking $716B from Medicare to pay for Obamacare, expecting to maybe make it up down the line), with $16 trillion in debt (up $6T in 3.5 years!), it is dumb. We cannot afford it, and 6 million people are about to pay a tax from it, says the CBO... (I think I may be one...dang). We can't afford 20 million illegals on our Medicare either, which is the problem here to begin with (never mind the fact it was going bankrupt already on its own). Maybe we could afford it if we actually only paid for the people who live here legally and EARNED it.

Even if you tax the rich 100%, you will only fund the govt for 3 months. Do you get that? Since they pay 70% or so of the taxes (depending on what % you count as rich), who's left to pay for everything else for the rest of the year? Further, what moron would stay in a country with a 100% tax? 3,000 millionaires have left the country since Obama took office (oops, might be just this year, since Obamacare started looking real...don't quote me), and we have had the most people attempting to get Canadian citizenship in ages (per Canada's statements).

It's dumb when you run up the debt and GET nothing back. It's dumb when you take our taxes and spend $500M on Solyndra (HIGH RISK) when people had already said you were guaranteed to be beaten to death in batteries by China. Surprise. It happened. $500 million on Fisker (high risk), who never made a car in America, is DUMB. That didn't work out either. The govt should not be spending your taxes BETTING at a casino. Private business would NEVER have spent that money, as they already knew the results of both. Solar, same thing. Not enough proof?
http://www.foxnews.com/on-air/your-world-cavuto/2012/09/20/stimulus-funds-used-buy-solar-panels-china
Apparently they beat the american bids? :) A little evidence they are winning on solar and a bad investment?
http://www.dailymail.co.uk/news/article-2185231/High-earners-planning-leave-France-75-tax-rate-income-1million-euros-goes-ahead.html
Watch americans run faster if they try this crap. France is about to get a wake up call. Good in my mind, maybe we'll watch them have a meltdown due to tax and stop this idea.
We clear on what I mean by DUMB now? You're right, it depends on how you plan to use the money borrowed. I submit AMD buying ATI for 3x the price was dumb (it tapped them out), which is just what I said. But you're forcing me to defend what I didn't say now. :) And I'm uninterested in debating politics with you. But the numbers don't lie: AMD is in trouble. Period. I do not need a macroeconomics lesson...LOL. But you may want to look at that balance sheet of theirs before you defend his position again.
http://www.zillow.com/blog/research/2012/05/24/despite-home-value-gains-underwater-homeowners-owe-1-2-trillion-more-than-homes-worth/
Ouch. Homes are not always a good purchase :) I'd venture to guess at least 30+% agree with me 😉
DUMB for us to loan them money on something with no collateral. No inspiration to hang around and pay your bill with nothing invested. Another dumb thing? It will all come back before they pay off their 30-year loans. Their bill doesn't go up if they're locked in, even if your house is worth $10. It matters not. They have a contract for ~$500-700 or whatever (insert #), and that hasn't changed because the value dropped in half, etc. You just end up with bad credit and no home. Stay and you can recover and not go back to an apartment. :)

I am wondering about GCN's tessellation performance compared to Kepler's, though as I remember from a comparison article here on TH, the HD 7970 didn't necessarily trump the GTX 680 with tessellation, i.e. the GTX 680 still got more frames. It just didn't experience as much of a drop in performance when it was on compared to when it was off, i.e. efficiency (I hope you don't confuse this again with a min-to-max framerate ratio like you did with MSAA).
Wonder no more, I quoted cleeve above :) Hardocp spelled it out for you too, but you ignored it. Did you read through my post just looking for things you could pick at, or actually soak up what I said? Hardocp says it in every article on these cards when radeons dip. You read the numbers, and bothered to do the math (again to pick?) but didn't bother to read the conclusions in the articles (many I pointed to) or anything else it seems. You seem to be picking what you're reading...I'm not raging either, I'm just saying...Again, can't really blame you for not reading it all. But at least don't complain before doing it. The complaint rings kind of hollow then. :)

You can't really make an insightful comment about what's in a book without reading it correct?
You said:
I'm not sure where blaz said the GTX 680 sucks. I honestly didn't look through the whole thread
Hence my you're on "the team" comment, verifying what you said yourself regarding this, or at least making me wonder :) I'm wondering why you comment at all without having some sort of motive. We were not having some nasty fight and didn't look like we needed a referee or something :) I will fully read your comment, and comments related to it, to figure out what you said and why before I say something refuting it. I have no leg to stand on if I don't (much like luciferano). My comment had nothing to do with a 660TI beating a 670 or anything related to it...LOL. I never said a 660TI is faster than a 670 clock for clock or anything remotely related to this. My comments were all related to 600's not being able to run MSAA, the nosedive comments, and "7000's are better." He wasn't reading the posts. Is he on the team too? How do I join? :)

It seems like we have different definitions of "real-world performance," because, as far as I'm concerned, "real-world performance" refers to performance data derived from any application actually used by people, games and productivity apps for example. Now, some people might actually derive fun from certain synthetic benchmarks, just like they do games (I did enjoy walking around in Heaven Benchmark. Don't judge me!!! :lol:). For those people, would anyone think I'm wrong when I say that the performance results for those synthetic benchmarks pretty much become "real-world performance" numbers for them in essence? After all, what is a game or entertainment application other than a tool for fun?

If you don't consider timed demos or repeatedly run scripts in specific scenarios in games (what I would call "real-world" benchmarks) to be able to indicate "real-world information," then what can? I'd have to contest you there since, even though these benchmarks only show performance under a certain scenario in a game, they are data from that specific scenario nevertheless, and they do tend to indicate the overall performance of the rest of the game, depending on how these benchmarks are made. What else could we base our purchasing decisions on (one of the main, if not the main, reasons these benchmarks are made)? When you were talking about it in your post above, you gave benchmarks that were made to saturate memory bandwidth. That sounded like a synthetic benchmark.
The 4GB vs. 2GB? Hardocp HIGH AA super tests? That's to prove a point, like I said. Not to prove how you'll run necessarily. He was pushing them to areas you can't play in, just like Hilbert. But hardocp did use real world stuff, I'm just saying anything in the 2560 (skyrim aside) wasn't meant for anything other than to prove MSAA wasn't an issue. Most of Hardocp's comments were Bandwidth related. He was after proving this just like Hilbert was trying to find a case for 4GB over 2GB. It's informative nonetheless, but he wasn't making statements about one being better than the other there and flatly points out after each bench at that high res, that neither ran there playable. Which may have also been his point. Maybe he read anandtech's analysis...LOL. His 2008/2009 article I pointed to mentions their pissing contest :) He quotes Derrick defending canned tests...LOL.

I'm not sure what you consider a "real-world" test. Playing through a whole game at certain settings with a certain graphics card and seeing how it performed, for example? You could consider that to be one big timed demo in itself if you look at it. It can't be helped that people end up not doing the exact same thing a timed demo demonstrates, but having a sample is a good thing to have compared to nothing. For one thing, comparison between components would probably be inaccurate to a certain extent because someone not running a scripted demo would probably end up doing things differently each test run, thus likely resulting in different test figures.
I thought I laid that out in my last post. Again, I gave links to hardocp's "Benchmarking the Benchmarks" article and another from 2009 there. You didn't read that before attacking either. Jeez dude, at least read the stuff you attack before joining the "team" please. J/k, I saw yours come in as mine went up...LOL. I doubt you got to it. You asked before what I meant and I told you, with links to explain what I meant. Ever heard of OfficeBench? That's real world. Yeah, I agree with some of your apps stuff, but it has to be like what we'd do in the real world or things are kind of slanted. Did you read Van Smith from vanshardware, whom I pointed to? The SYSmark stuff? No... Again, you didn't, or you'd get it. I said quite a bit about BAPCo and how that all went down. WOW. This comes right back to running the cards BELOW what they ship at. What for? Do you regularly buy cards and want to know performance in a situation you will NEVER be in? My car analogy was spot on. :) You know anyone going home with a $300 card to purposely slow themselves down? This is akin to what I mean by NOT REAL WORLD. As in, we will not do that unless you're trying to defeat something, like I am with the heat issues (I said I wouldn't, only downclock the CPU). It's not real-world when Intel basically owns BAPCo's SYSmark. How fair is that going to be to AMD? OfficeBench was born, and I think kind of killed, by Intel...LOL.
http://exo-blog.blogspot.com/2010/03/officebench-decade-of-pc-benchmarking.html
Quick google there for ya... They were the thing 10-12 years ago, or about to be, but I think someone nixed it, or paid him to go away quietly, or kept him off sites via ad money etc. (much like the news ignores anything Obama does but attacks the crap out of Romney :)). It's sort of dead now, it seems. It represented what we really did in apps, and Intel wasn't happy about that back then. They were cheating :) Got caught, dumped BAPCo's domain registration, fixed the land issues etc...LOL. They were using apps that flagged the CPUs and wouldn't turn on stuff on AMD's chips. It took forever to patch it, too, to recognize SSE/SSE2 (if memory serves) in portions of the benchmarks. Knowing this, they conspired to repeat the functions AMD wasn't being SEEN in, despite them being right there ready to be used. Intel... er, I mean BAPCo (hehe), kept prolonging the patch to keep AMD down in the benchmarks. Nasty, all over a freaking flag identifying the CPU and slanting the benchmarks. This is hardocp's point, I think, to some degree. Intel threatened everyone back then, including Asus etc. My Asus P2B back then came in a white box and had no name on the board...LOL. Thunderbird/Athlon was kicking their butts and they had to threaten people until they became competitive again :) I hope you get where I'm coming from with this and the links to hardocp's story. You can probably still google all the benchmark shenanigans. I remember Anand didn't use the patch for a while (but discussed it a little), and PC Mag/ExtremeTech refused because it wasn't legit (AMD released it on their own to fix the flag, but sites refused because BAPCo hadn't officially put it in forever...LOL. WHAT?). No bias there... If you believe that, I have this bridge... 😉


I'm sorry, but you're either lying or mistaken with implying that you didn't attack blaz. A lot of things you said above were insults to him and his abilities. I wouldn't take this against you, but saying that you didn't is another story.
Explain. I didn't call him stupid. Just ignorant in a round-about way :) Where did I insult him? I'm quite sure I said he was wrong and factually incorrect many times. But he better get thicker skin if that's the roughest thing you can find 😉

I don't think he would be done saying that stuff as they do seem to be credible. Again, take note of his statement. He's saying that the 7000 series cards take less of a hit compared to the 600 series cards when turning on/increasing (MS)AA due to memory bandwidth differences.
He didn't just say that. Again, read what I quoted him saying. Jeez, just Ctrl-F "blaz" if you can't bother to read it all. Next, next, next... 😉 You will see my quotes of his with timestamps. I refuse to quote it for the 4th time (once to him, once to you pointing out his EXACT quote, and once to luciferano, same thing with timestamps). How many times do I need to quote the guy? What for, when you won't bother to read it, no matter how many times or apparently who I say it to? :)

Again, you've been getting percentages derived from the min and max framerates of certain game settings, when you should've been getting percentages from framerates of games when AA was on and off. With that in mind, I wouldn't say he was wrong for saying that the GTX 660Ti and less so, the GTX 670, have a weakness when it comes to (MS)AA. I would consider it a relative weakness that they aren't as efficient with MSAA as the competition, but that one relative weakness doesn't mean they perform worse at certain settings when it comes down to it. Maybe he could've been more specific as to say "MSAA" and you would be right to reprimand him for that if what he said didn't apply to SSAA, FXAA, etc.
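To make that distinction concrete, here's a minimal Python sketch of the difference between "scales better with AA" (efficiency) and "performs better with AA" (absolute frames). All framerate numbers here are invented for illustration, not taken from any review:

```python
# Hypothetical framerates for two cards in the same game (invented numbers).
card_a = {"aa_off": 90.0, "aa_on": 63.0}   # e.g. a Kepler-class card
card_b = {"aa_off": 75.0, "aa_on": 60.0}   # e.g. a GCN-class card

def aa_hit_percent(fps):
    """Performance drop from enabling MSAA, as a percent of the AA-off rate."""
    return (fps["aa_off"] - fps["aa_on"]) / fps["aa_off"] * 100

# Card A takes the bigger hit from MSAA (worse scaling/efficiency)...
print(f"Card A loses {aa_hit_percent(card_a):.0f}% with MSAA on")  # 30%
print(f"Card B loses {aa_hit_percent(card_b):.0f}% with MSAA on")  # 20%
# ...yet still delivers more frames with MSAA enabled (better performance).
print(card_a["aa_on"] > card_b["aa_on"])  # True
```

Comparing the min and max framerates at a single setting measures neither of these things; the AA-on vs. AA-off ratio is what the scaling argument is actually about.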

If blaz prefers the HD 7950 compared to any cards Nvidia offers, he is entitled to his own preference. He didn't tell us not to get Nvidia cards. He said "I...would not want anything from the Geforce GTX 600 series for a higher end gaming system..." This thing he said may be more questionable "I really don't like playing at 1080p without some serious AA and Nvidia simply doesn't deliver in that," but again, he is entitled to his own opinion which may have been influenced by his choice of games and settings. If the 7000 series performs better for his needs, then maybe that's why "Nvidia simply doesn't deliver" for him. What "delivers" or not is subjective. Though again, that last statement of his is debatable and it's quite fine that you spoke up against it.
He claimed something that wasn't true, as shown in the benchmarks. I didn't tell him what to buy. It's a free country. I showed ample evidence and statements from reviewers rebutting his statements. I clearly pointed out it's not correct to state 600's tank in anything where 7000's don't do the same (at a PLAYABLE setting). I do not care what he buys, just that he tells others that it's a mistake to buy based on bandwidth (he made statements about this/ROPs etc.) and MSAA performance. Hardocp shows this is NOT an issue or they would win more benchmarks (anand/hardocp both show it winning handily at any playable res for both). 1 win, 6 losses at 1920x1200 at anand. I quoted hardocp enough that you should get their opinion. "Wiped the floor" is pretty clear in the case of the 680 vs. the superclocked 7970 - can't happen if all 600's are weak and 7000's strong, correct? I quoted it to blaz; Ctrl-F "wipe" and you'll find it. Or better yet, just read hardocp yourself; I could be lying, as you insinuate I've done already :) Wouldn't want to snow you, or have you take my word for it. 😉 That's kind of the point of all the links. Nobody has to take my word for it. I am hoping they read it and don't bother attacking me for nothing repeatedly 😉


You seemed very offensive and also overconfident with your statements throughout this (and other) post(s) of yours. Remember that whenever anyone does what you did, along with bragging, they just open up a way for themselves to eat their words (later on). That's why I try to be safe with what I say. I try not to be absolute, i.e. I stay open to other people's thoughts, thus acknowledging that I can be wrong, which is only normal for humans like us. Humility and tactfulness are virtues for a reason. :)

Sorry, but you got sidetracked a lot and made/implied false accusations. If you're gonna try to refute someone, you have to work against what they said and just focus on that. You've obviously put a lot of effort/work/energy in these posts, but no matter how much you do put into it, if you miss your target or aim in the wrong direction, it would be to no avail.

I'm not saying every single one of your points were wrong. A lot were factual. The ones that didn't seem so might've been mostly your claims against blaz. Anyway, they might've been factual and useful, but most of it may have been irrelevant to actually proving how the guy was wrong...
Point to things I said that are false please. He made statements categorically (please read the 3 quotes). I refuted them and backed them up and I didn't even say either sucks, just that NEITHER does where playable. Which hardocp very clearly points out. False accusations? Implied? If I say what I'm actually thinking it seems you're offended. Just how do I have a conversation with someone in your mind? He gave a "lesson" to the OP but I'm different? Cleeve just about did the same (well, I'd say he did with the troll comment to jaquith, but it matters not I'm not discussing cleeve's comments here). They both had their opinions and points (it's up to who's reading them to determine that I guess).

Where did I brag, and about what? It is not bragging if I say Bob/Jack etc. is ignorant. It's not even wrong if it's proven true. I never once said I was a financial or computer genius either. I may have alluded to the fact that I know enough not to make a statement about financials without actually "digging deeper," but that should go without saying :) I may know enough to believe I can comment on it, but alone that's not bragging. Pointing out I know how to fix a car is not bragging about it. If I tell you I'm the best car fixer in the world, well, then you've got me :) If you are not confident in your opinion, I'd suggest you keep it to yourself. Start putting question marks at the end of your sentences when making statements and people will assume you are unsure of yourself after reading your statements with upward inflection. :) I prefer not to look weak when I believe I'm right until proven wrong, at which point I'll acquiesce, which is what it looks like blazorthon has done. I can't see anything in his arguments that says AMD is financially sound, with proof. He commented to another poster without anything to back it up. Unfortunately, financial comments are easily proven or disproved. It's not hard to read a balance sheet; I'm nothing special among the millions who do. I went over the game data with links ad nauseam 😉 Some people are probably about to vomit over the financial talk :) But it was regarding something everyone should want to know about. I care about AMD (or at least their IP) staying alive. Heck, if they go out of business tomorrow (not likely) you'll see me buy a CPU and video card that day...LOL. I would expect prices to go through the roof on both and would advise all of my friends to do the same, and everyone else I know. Intel/Nvidia do not deserve to be able to steal from us, and that is what happens when competition sucks. Witness Intel having no GOOD (IMHO :)) CPUs below $200. By that I mean i5/i7 quad (why buy anything else :)).
Intel can keep them above this because AMD hasn't been competitive in a long while. Intel can slow down shipping new cpus because AMD isn't pushing them.


I hope you don't start hating me for this. I don't like "losing" people who have been nice to me. :)

(Whew! This took me a few hours to write as it seems. Well, that goes to show that I don't really have anything better to do, but I still think it's worth it as long as at least one person gets to read it. :))

ROFL. Let's see, you said I made a bunch of implied or false accusations without anything showing that (that I could see, just a statement or two about it), insinuated I'm a liar (pretty much says I am...heh), said I bragged, I'm overly confident, I'm dumb and need a lesson in macro economics (despite clearly knowing quite a bit about money and stocks/financials), I'm "outraged" etc...LOL. I could go through and find more but what's the point. Sure sounds like exactly what you said about me eh? Nah, not offended. I just know where you stand now :) I'm just reading bits of your two new posts...You're doing it again...ROFL...Don't rage now...hehe. How did you put it:
That's why I try to be safe with what I say
I'm thinking you're saying I'm not?... :)
Remember, that whenever anyone does what you did along with bragging, they just open up a way for them to eat their words (later on)
Make sure you don't do what I did...ROFL. Wow. I haven't eaten a word yet, AFAIK or can tell. You've made a lot of insinuations, opinions, etc. about how I should talk, but not much of any of this has much to do with the data. Your financial comments? Pretty much just that: comments or opinions, correct? I need a lesson, but you gave none. I gave a TON of info regarding a company going bankrupt and you start talking loans :) You critique my luciferano posts, but he attacked my format and length before I said anything. What should he expect back? Am I supposed to cower in fear, acquiesce for no reason? He then went into things this was NOT about. I never said a 660TI is better than a 670, which he went on about. Who's talking about that? I'm supposed to be the better man, I guess. What for? No data he can give will change the fact that he's talking about something we were not discussing, and he hadn't even read it before making statements about the "quality" of my formatting and the length of the post. That will at least draw a comment, and usually a much nastier one than I gave. After what you've done to nearly every one of my posts (and another two while I typed this), I see no point in worrying about people's feelings. I thought we were all less sensitive, or you would have been more careful ("safe") with what you said and not stooped to my level (did I get that right?). You just called me a troll...LOL. Seems like the same thing :) Childish too... The insults keep coming...ROFL. People need to lighten up and grow thicker skin :)

I took debate class and had wars with best friends. It mattered not. We had a beer 20 minutes later...LOL. Well, I had shots, but you get the idea. One of the best friends I've ever known is just about the most staunch Democrat I've ever met (I am not :)). He helped run campaigns in Oregon and his dad was in govt. We managed to coexist and play on the same teams.
 
[citation][nom]somebodyspecial[/nom]Not quite sure you even understand what this argument is about. I never said the 660TI scales better than the 670 or 680. They will perform in that order. I'm not even sure it's worth answering this, as your whole premise is based on something that isn't really happening. The 680 is faster than a 670, which is faster than a 660TI... I'm confused. HE said: That's all NV cards he's talking about vs. 7000's (all AMD cards). NV does not take a nosedive, as every game I showed is clearly showing. If it nosedived I'd expect all victories for the 7950 Boost vs. the 660TI. You really didn't read my stuff or his, did you? HE said: He's saying the 600 series performs less than the 7000 series Radeons AGAIN. HE said: Seriously? That's the ENTIRE 600 series he has dogged vs. just the 7950 at 1080p (1920x1080). This is not a discussion about the 660TI not scaling as well as a 670. I thought we all knew that already. OOB the 660 will lose to the 670. The 670 will lose to the 680. But that isn't what this is about. It's about 600's TANKING vs. the 7950 (or heck, even the rest). You don't even understand what is being talked about. You clearly didn't read my post. The bottom of my post to army_ant7 has all 3 of these quotes with times posted. I'm pretty sure army and blaz both understand exactly what I'm pointing out, and there is nothing to argue about here. It is WRONG. You didn't even read my post to you...LOL. I said this to you: "There is no game in their tests where the 7000 series is playable and the 600 series is not. Just click the links." Which really is what this is about, as noted from his quotes. Read the conclusion page at hardocp as I requested you to do. It's page 8. It CLEARLY states memory NEVER affected the 660TI despite the memory differences vs. the 7950 Boost. It also clearly states they never hit a "VRAM WALL". Kyle even points it out. KYLE said: The 2GB 660TI NEVER HIT A WALL, even at EXTREME AA settings. Never mind the 3GB, as he points that out too.
NO WALL, NO advantage for the 7950B vs. EITHER the 2GB or 3GB 660TI. Read my post to Army. Nothing changed with the 7970 clocked at 1280 vs. the 680; Kyle said in his superclocked comparison that it SWEPT the floor with AMD's cards, even against the REF model. Also read this article about 4GB at guru3d: http://www.guru3d.com/articles_pag [...] ew,18.html Palit 2GB 680 vs. Palit 4GB 680... He tested this at the highest settings; the fun starts on page 18, in DX11 stuff specifically set up to show 4GB beats 2GB...LOL. Nothing happened. Despite Hilbert's efforts to find a reason to spend the extra $90, he failed. From the article: Hilbert is WELL known for his excellent reviews with in-depth analysis attempting to prove a specific situation or issue or product quality. His motherboard reviews are quite awesome, getting right down to the types of capacitors, VRMs, etc., the chosen chips and why they are good or bad... I love his reviews. Here's a sample: "The Maximus V Extreme comes with 10k Black Metallic capacitors, offer a 5x longer lifespan with 10k hours at 105C, and 20% better low temperature endurance - specifically selected for extreme cooling scenarios. The overall choice on components is the best of the best really. But let's zoom in a little." He proceeds to "zoom in," so to speak... He usually gives FAR more detail than most would want, which is, incidentally, my favorite part about reading his articles, even when I disagree. I love his details. Another quote from Hilbert on conclusion page 26: Nothing happened, and he can't even justify it on this page, even saying (and he's just guessing) it MAY make a difference with triple monitors in 3D (again, just guessing, because NOTHING he ran proved any difference up to maxed out 2560x1600). Also note the 4GB ALWAYS lost to the 2GB...ROFLMAO. IN EVERYTHING. Even though the argument isn't about this, I still proved you wrong. You can't even get your incorrect argument correct. Jeez.
He is talking about maybe making a difference at a hypothetical 6MP resolution. Hardocp did an article at 5760x1200 also, proving the 680 wiped the floor with the 7970 even at this res, so I'm not sure it matters. Are you having trouble reading the parts that are PLAYABLE? At no point in 1920x1080 in the article you are quoting (my AA article - pages 2-4) does the 670 beat the 660 by more than 16.2%. It's not 15-25%. You are exaggerating. The only time it gets above 20 is at UNPLAYABLE settings on BOTH cards. Is there any point in saying X beats Y when they are both running 15-17fps? Seriously? I don't care if you can show a difference at 5760x1200. Of course I can force 2 cards into a situation where I can finally show one beating the other. If the 660TI is running 1fps and the 670 is running 2fps, you've proven the 670 is 2x faster here. But IT ISN'T PLAYABLE, SO I DON'T CARE. Max Payne, Batman AC, Skyrim, Battlefield 3: NONE are more than 16.2% different at playable settings. So don't quote 15-25% when it is USELESS to run where you can prove your case. Again, this thing isn't even about this, but you can't be bothered to figure that out, or even make a useful case with what you did say. Umm... Are we reading the same hardocp site? You just said a JUNK 7950 can beat the BEST 660TI? While Kyle keys on avg performance, he also shows the minimums. But I defy you to show me a JUNK 7950 beating the best 660TI (or heck, any OC 660TI) at hardocp. PAGE 8, KYLE'S CONCLUSION: "Our performance results were not too far apart in most of our gaming between the Radeon HD 7950 w/Boost and GALAXY GTX 660 Ti GC at high AA settings. We often found both video cards with fewer than 10% performance difference. We saw a few instances where the Radeon HD 7950 w/Boost was faster, and we even saw a few instances where the GALAXY GTX 660 Ti GC was faster at higher settings.
So much for the extreme memory bandwidth advantage on the Radeon HD 7950." Another quote from the same page: "Benchmarks and the like cannot tell you this real-world information, and can be extremely misleading. Take for example a test that stresses memory bandwidth and fills it to the brim, sure, the 7950 would win that test, but that test doesn't translate to what we just found out in real-world gaming. Performances were close between these cards, and in some cases faster on the card with the lesser bus and bandwidth. So focus on what real-world gaming tells you." Both statements say pretty much half went to the 7950 Boost (by definition this isn't a JUNK card; it is AMD's BOOSTED edition) and half went to (Kyle's words) "the card with the lesser bus and bandwidth"...LOL. Umm, you're wrong. Again, I'm not even sure why I'm having to discuss something that isn't even what this entire conversation between me, blaz, and Army is about, but you're wrong at every turn. I'm not even going to bother responding to you again. You are not even arguing about our conversation. You're having a conversation with some "other" me who I'm not aware of... Is my post pretty enough for you this time? It doesn't change a thing, but isn't it formatted pretty? I digress... Please don't waste my time again. Good that you didn't argue about the "AMD can compete with Intel" that blaz said...LOL. I will gladly BURY anyone who wants to tell me AMD can financially compete with Intel, as I have already done even vs. Nvidia. Inserting INTC into that commentary makes my statements ridiculously friendly to blaz's misrepresentation of AMD's finances. INTC's profits AND sales AND market cap are 11x NVDA's. I still can't believe he said that. Are we done yet? I was never offensive to anyone, unless you don't like data and long posts (I see you don't). If I can't say someone is WRONG and PROVE IT without being offensive, I'm not sure what to do or how to word it to make you happy.
You attacked my formatting in the previous post (as usual, when someone can't argue with the data) and put words in my mouth as if I'd said he committed a crime for editing. I merely explained why it's difficult to quote someone when their posts are changing. This was in defense of me not coming up with what army_ant7 wanted... though I found stuff that gave him exactly what he wanted anyway - precise quotes where he unanimously declared the 600 series is bad and tanks vs. the 7000 series. I hope this is easy enough for you to follow and you don't start misquoting me again (or leaving out blaz's REAL claims, thus misquoting him too). If you find it offensive that I've proven YOU & blaz wrong, well, that's your opinion about ME and you're entitled to it, I guess. The devil's in the details, eh? Love the handle. I tried to get nobodyspecial but it was taken...LOL... Have a good night.[/citation]

Actually, it is you who didn't understand what blazorthon's argument was about. He talked about scaling, how AMD had superior scaling to Nvidia, and how the GTX 670 had superior scaling to the GTX 660 Ti. He wasn't entirely accurate about the reasoning behind it, but he was still correct about it nonetheless, in that MSAA (and, according to your link, also CSAA) scales better in efficiency with higher memory bandwidth. Your links prove this, as does his own.

Your links, as well as blazorthon's, show that Nvidia does in fact take a "nose dive" in scaling compared to AMD, although that is not what he said anyway. He said that AMD has better scaling with higher resolutions and with higher levels of AA and can have better performance with AA. So far, SSAA is the only apparent exception to this. No, FXAA doesn't count because it's not a high level of AA even when it's maxed out, and really, as others have said previously, it's more of a blur effect than actual AA anyway. Tom's and other sites have proven again and again that AMD has better scaling with their Radeon 7xxx cards compared to Nvidia's GTX 6xx cards and, depending on the situation, AMD can take quite the lead.

It's also true that the majority of GTX 660 Ti reviews used outdated AMD drivers to make AMD look far worse than they actually are. Would you like a head count of this? I could do one for you if you like.

Not once did I say that 2GiB is not enough memory for these tests, nor did I say anything about the cards hitting memory walls. Not even once. I said that 1.5GiB of memory is not enough, and the HardOCP review shows this quite extensively. The GTX 660 Ti's 2GiB of memory is not organized in the same way as the GTX 670's 2GiB of memory. The three memory controllers have 512MiB each in common (1.5GiB together), and this memory runs, altogether, at about 144GB/s (not accounting for memory controller efficiency, which brings it down to probably around 100-120GB/s). But wait, this card has 2GiB. Where's the other 512MiB of memory?

A single memory controller has that last 512MiB (one controller has 1GiB whereas the other two have 512MiB each), so that last 512MiB runs at a mere one third of the bandwidth of the rest of the memory. That is why the GTX 660 Ti 3GiB was able to pull closer to the GTX 670 than the GTX 660 Ti 2GiB. The 3GiB model has all of its memory running at full performance because all controllers have the same amount of memory, 1GiB. Basically, the 2GiB model has 1.5GiB running in triple channel whereas that last 512MiB runs in single channel, and this had an effect on performance that is made obvious through the tests with the 3GiB model. I said this in my previous post too, and I'd appreciate it if you'd listen instead of mocking me and making me have to repeat myself.
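That asymmetric layout can be sketched numerically. This is just a back-of-the-envelope illustration using the 660 Ti's published nominal specs (192-bit bus, ~6GT/s effective GDDR5), not a claim about how Nvidia actually implements the interleaving:

```python
# Back-of-the-envelope sketch of the GTX 660 Ti 2GiB memory layout
# described above; nominal peak numbers, ignoring controller efficiency.
BUS_WIDTH_BITS = 192      # three 64-bit memory controllers
DATA_RATE_GTPS = 6.008    # effective GDDR5 transfer rate, GT/s
total_bw = BUS_WIDTH_BITS / 8 * DATA_RATE_GTPS   # ~144 GB/s peak

controllers_mib = [512, 512, 1024]               # 2 GiB total

# The first 1.5 GiB interleaves across all three controllers at full speed;
# the last 512 MiB sits on the lone 1 GiB controller at one third the speed.
interleaved_mib = min(controllers_mib) * len(controllers_mib)  # 1536
tail_mib = sum(controllers_mib) - interleaved_mib              # 512
tail_bw = total_bw / len(controllers_mib)                      # ~48 GB/s

print(f"{interleaved_mib} MiB at ~{total_bw:.0f} GB/s, "
      f"{tail_mib} MiB at ~{tail_bw:.0f} GB/s")
```

On the 3GiB card every controller holds 1GiB, so all of the memory interleaves at the full ~144GB/s, which is consistent with the point about the 3GiB model pulling closer to the GTX 670.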

Also, here are a few other links from HardOCP on GTX 660 Ti reviews:
http://www.hardocp.com/article/2012/08/27/nvidia_geforce_gtx_660_ti_at_high_aa_settings_review/6
7950 with Boost has a slight win overall in apples to apples comparisons. Whether or not both cards are unplayable regardless of this in many tests doesn't change this and whether or not the 660 Ti was only a little behind in most of its losses doesn't change this.

http://www.hardocp.com/article/2012/09/03/msi_gtx_660_ti_power_edition_oc_video_card_review/5
Then there's this. We get a non-reference 660 Ti with a very considerable factory overclock, a Radeon 7870 with a similarly large factory overclock, and a reference 670 and 7950. Yes, that's very helpful: comparing a factory overclocked Nvidia card to a reference card at the same price point, a reference Nvidia card at a higher price point, and a factory overclocked AMD card at a lower price point. Seriously, what's the use of a comparison that refuses to even use proper comparisons? Heck, the 7870 used was faster than the 7950! Even worse, the apples-to-apples comparisons use different resolutions for different games.

The tests used were clearly chosen specifically to emphasize Nvidia and were inconsistently chosen for real apples to apples comparisons. If someone will test something at 1080p or 1920x1200 in one game for apples to apples comparisons, why wouldn't they do so for every game? Failing to do this makes this review useless for pretty much all readers. Any 1080p or 1920x1200 gamers would be stuck with only some games being tested properly and the same is true for 2560x1440 and 2560x1600 gamers. This review is nearly useless. The above review, although using many unplayable tests, tests each and every game in both 1080p and 2560x1600.

This means that for the games that the reviews tested, both 1080p gamers and 2560x1600 gamers can use the review for all games tested. That review had the goal of testing the impact of heavy MSAA in this situation and it did those tests with the two cards that were chosen more or less impartially. It would have been even better if it included playable comparisons with lower settings in the tests that were unplayable (but at the same resolution) so people knew what to expect in real-world performance, but still, it succeeded in its goals for the few games that were tested. However, since so few games were tested, it is still not good for making generalizations across the majority of modern games.

Now this is the review that I was talking about earlier:
http://www.hardocp.com/article/2012/08/23/galaxy_gtx_660_ti_gc_oc_vs_670_hd_7950/1
My one problem with it is that it doesn't properly overclock the GTX 670 TOP, which should have had a much higher memory frequency than a mere 1.585GHz. Something a little over 1.7GHz shouldn't have been a big deal. Regardless, the GTX 660 Ti 3GiB (a Galaxy 3GiB, probably the very same model used in the other reviews with a Galaxy GTX 660 Ti 3GiB) was properly overclocked, and so was the poor-quality Radeon 7950 used in this test (a mere XFX model). The 670 still beat the 660 Ti 3GiB and lost to the 7950 most of the time. So, anyone who says memory bandwidth doesn't matter is not well-versed enough in the subject. And anyone who insists on pitting an overclocked Nvidia card against a reference AMD card, rather than an overclocked one, is also being biased.

All of your links clearly show that memory bandwidth is a factor in performance and so do pretty much everyone else's. blazorthon was no exception in this.
 
Also, since an AMD CPU such as the FX-8120 or FX-8150 genuinely can compete with the i5s and i7s in gaming performance (yes, even against the K editions in overclock-versus-overclock comparisons), AMD obviously can compete on their current budgets and is improving too. Competing with the i5s is as simple as disabling two cores and overclocking the CPU/NB frequency to 2.8GHz or 3GHz.
Competing with the i7s is a little more complex, but still reasonable: use PSCheck to drop the P-states of the second core of each module significantly (no more than a 45-70% drop is needed; how far you drop them depends on how much you want to favor lightly threaded versus highly threaded performance) and prioritize the now-primary core of each module so the second core acts more like a virtual core rather than the physical core that it is. This lets the CPU use the front end more intelligently, so the two cores of each module do not interfere with each other and waste a lot of power.
 


You're mistaken, I never called anyone a troll for pointing anything out. You can point out things all day, feel free to do so. That doesn't make you right, doesn't make me right either. But that's not important.

I called him a troll because he repeatedly suggested bias and impropriety instead of arguing a point.




I didn't run any other modes because I didn't have infinite time. What we produce is always down to the wire. If I had weeks to do the review, I'd have added many more modes. We're not given that luxury, and I prefer to add more games than modes. And I feel the modes we used are valid because the methodology is valid.

You obviously feel otherwise, but every game I tested was quite playable on the 660 with the AA settings I used. I was there, I saw them. As I stated earlier, I chose the settings based on what the 660 could handle and went from there. Some games have no AA, some have 4x AA, some have 8x AA. If I had it in for NV, and knew ahead of time that the 660 had a problem with 8x MSAA, then why didn't I apply 8x MSAA to all benches? Why didn't I use the global illumination setting in DiRT Showdown, something GeForces have a big problem with? Clearly, there's no anti-GeForce agenda here.

And frankly, saying "I'm not willing to say someone has a bias yet" is, actually, suggesting someone has a bias in a slightly underhanded way. 😉

I'm really happy with how this review was carried out. I'll perform future reviews with the same procedure, too. And if our method isn't cookie-cutter with other sites (and what some folks feel comfortable with), I don't have a problem with that. I'm happy to serve the readers who appreciate another valid perspective.


 
First just to get this out of the way, and note this is the 4th time I've posted his exact words. So I'm putting it right up here at the top so you can't miss it.

"Different settings can affect performance greatly on some card while not so much on others. For example, using heavy MSAA doesn't hurt AMD's Radeon 7000 performance much whereas Nvidia takes a nose-dive when it's used. "
blazorthon 09-16-2012 at 11:25:18 AM
He's very clear here. Nvidia cards (all of them, there is NO distinction) take a nosedive and the 7000s don't. For this to be correct, don't the 7000s have to win the benchmarks in question? He's got an absolute statement there. Sorry. I mean, if you make a comment like this, surely the 7000s should beat the 600s, correct? Or what is the point?

"I'd also still be worried about the 660 Ti when you play with some good AA such as 4x MSAA and 8x MSAA. Even the 670 shows weakness in this, so I have no doubt that the 660 Ti would still do poorly in it. Even 1080p can show the 670 and especially the 660 Ti waning significantly in heavy AA while the Radeons with GCN GPUs just keep chugging along at about 85%-95% of the frame rates that they had without AA."
blazorthon 09-22-2012 at 02:26:48 PM
So the 660 Ti, and now the 670, do poorly in 4x/8x MSAA. He has no doubt. They don't keep chugging along...LOL. But they lose or tie in 3 out of 4 benchmarks at HardOCP at 1080p (heck, it doesn't change at 1600p either). I think 14 is smaller than 16, correct? Again, he's talking absolutes (didn't you say people eat their words later after doing that? Shake your finger at him, not me, next time...LOL). "670 shows weakness" leaves no room for doubt, and he's using that to justify his comment about the 660 Ti. "Waning significantly" is an absolute statement that they drop and the Radeons don't. How much does this matter if you don't win the benchmark? In order to actually claim something is better, doesn't it have to literally be better by definition? If we are talking about one card (or multiple cards, the 600s vs. the 7000s, whatever) being better than another at 1080p, shouldn't it actually be able to WIN at 1080p? Shouldn't it be above playable at 1080p? You could argue the second point, but I'm pretty sure you need to show victory to claim you're better at anything. It goes without saying that he should be able to show the 7000 series (any card; he wasn't specific, and GCN covers all of the 7000s, but we'll pretend he said 7950B) winning at least half the benchmarks at 1080p with MSAA on, right?

Again Blaze says (check that stamp):
"I really like how the 7950 turns out in this and would not want anything from the Geforce GTX 600 series for a higher end gaming system, especially at 1080p. I really don't like playing at 1080p without some serious AA and Nvidia simply doesn't deliver in that. "
"Message edited by blazorthon on 09-22-2012 at 02:28:31 PM"
If that isn't an absolute, unequivocal statement saying ALL 600s suck for high-end gaming at 1920x1080, then I do not know how to read English and apologize...LOL. "ANYTHING". Did you catch that? No 600s for a high-end gaming system. He didn't really mince words here, did he? Is it ok to say you and he are wrong if you both are? Do I have to reword it to not offend all 3 of you? 😉 Give the defense a rest please... You are not helping yourself here.

Army_ant7 You said:
A "nosedive" is a nosedive, but that doesn't mean the GTX 600 cards nosedived past the HD 7000 cards in performance. blaz didn't say that.
He said that he had no doubt the GTX 660Ti would do poorly with those settings. Where did he say it would do poorly though? Maybe he was just referring to that chart he shared?
He just said poorly, correct? For all 600s, right? If something is performing poorly, I'm thinking by definition the other must beat it, correct? In order to make this statement valid, the 600s must lose to the 7000s, or at least the 660 Ti must lose to the 7950/B. I'd say the whole 600 series must lose, because he says he wouldn't want ANYTHING from the 600 series. Correct? That's rhetorical 😉 I mean, if he's NOT saying that, he would purposely take home a loser. That doesn't make sense. He has to win or this whole thing just doesn't make any sense. I'll give him a break, though, and not use his absolute words (which you apparently still think he didn't say) against him. I'll pretend he just said the 660 Ti at 1080p sucks for gaming :)

Point me to the nose-dives at 1080p. Don't bother if all you have is Tom's. Data please; statements don't do us much good here without the data. Mind you, it has to show the 660 Ti taking a dive while the 7950/B doesn't, and the 7950 actually WINS. HardOCP says they're even, but no nosedives. Good luck. If it's so easy to prove, it shouldn't take long to prove it outside Tom's. I'd half expect you to point out multiple sites, because he was so clear about the 600s sucking vs. the 7000s, as noted above (he didn't mince words...LOL). I'd hope he has mountains of this evidence to go with such conviction and absolution. He seems pretty, how did you say it, "overly confident"? :) I sincerely hope he reads more than Tom's since, as you said, you don't and only use others to fill in the details when needed. :) I had a hard time even finding a site that compared the drop from 2560x1600 to 1920x1080, which obviously isn't what he said about with and without AA...LOL. And then, it's unplayable doing it in the games it lost anyway.
Here's the hardocp numbers again:
Max Payne 3: 15 vs. 12 fps at 1080p (HardOCP).
http://hardocp.com/images/articles/1346060093OmTdS2Q4x3_6_4.gif
Don't even get me started on the point Kyle makes, which is that neither card is playable here...LOL.

Battlefield 3: it wins by 3fps, but... we are only 1 for 2 here. Also note it loses the average and max to the 660 Ti, so while it wins in my view (I judge by minimums), you could argue he's wrong again.
http://hardocp.com/images/articles/1346060093OmTdS2Q4x3_6_6.gif

Batman AC at hardocp?
Doesn't the 7950B have to win, and not lose 28 to 22fps, for his argument to work?
http://hardocp.com/images/articles/1346060093OmTdS2Q4x3_7_3.gif
Well, it lost, so again moot. I'll give you that it took a bigger dive (if we can actually tell that from going 1600p to 1080p, which isn't what he's saying), but if the 660 Ti still wins, and by 22%, I'm not sure he can claim the 600s can't do 1080p, let alone the 660 Ti itself, which is proven totally and unequivocally incorrect. Yes, I just used an absolute. The second you can prove 22 is greater than 28, I'll acquiesce 😉

Lest we leave one off and you claim I'm ignoring his only win...hehe. Skyrim... he did finally score a victory, with the 660 Ti scoring 50fps... Technically he's wrong 4 times in 4 games.
http://hardocp.com/images/articles/1346060093OmTdS2Q4x3_7_5.gif
Kyle sums it up best, but I think we're done saying one tanks when the other doesn't (in light of the facts presented here). Since it's just comparing the increases from dropping from 2560x1600 to 1920x1080, even HardOCP doesn't give exactly what he's saying with/without AA (like I said before, I tried finding exactly what he said and couldn't, outside of Tom's single chart, and only in this game where it wins already). But if it loses 2 out of 4, this discussion is over, correct?

More to the point: in 2 out of 4 games (and I'd say 3, when you're in the low 30s and will dip at some point; these are small snapshots in time), the 7950B is below 22fps at 1080p, and those are both games it loses in. So, since it loses in two games, and even in those two it is the worst in MSAA, this makes his absolute 1080p statements junk data, doesn't it? From his statements it should soundly kick the crap out of ANY 600-series card, but that's right, I'm pretending he wasn't so absolute and said 660 Ti. Still, same message, right? For the 660 Ti to suck, it should NOT win half the games, and the 7950 should at least be playable and certainly not be beaten by more than the 7950 beat it in its own two victories. The 7950 gets beat WORSE by the 660 Ti in the 660 Ti's wins than the 660 Ti does in the 7950's wins; ergo the 660 Ti handles MSAA at 1080p better, correct? Note I'm not trying to make this argument here, just pointing out what the data shows. We've already discussed ad nauseam that we can show victories on either side (heck, I just did that here). When it wins the two, it's 10% (55 vs. 50, and 34 vs. 31...10%). When it loses, it's ~12% and 22%. Those are bigger numbers. I stand by my statement: he is absolutely incorrect. To be right at all, the 660 Ti can't win, and certainly not by larger margins. This gets even worse if I go by average fps, as he loses Battlefield 3 then also :) OUCH. Not looking so good for the no-doubt-about-it champ 7000s.

For skyrim, the only game he truly wins in (min max and avg), lets not forget this little nugget :)
Important FACT: Once again both video cards are more than playable at this setting, sitting well above the 60 FPS line.

Summary

The Radeon HD 7950 has a 384-bit memory bus with 240GB/sec of memory bandwidth, while the GALAXY GTX 660 Ti GC has a 192-bit bus with 144GB/sec of memory bandwidth, yet we sometimes see the GALAXY GTX 660 Ti GC video card performing faster, but the 660 Ti does have a GPU clock advantage. We don't see that 66% advantage in memory bandwidth the Radeon HD 7950 has in real-world gaming with high setting AA configurations. This means there are other factors besides the width of the memory bus and the bandwidth that affect performance between these two cards; namely GPU clock and architecture advantages.
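As a sanity check, the "66%" figure in that quoted summary is just the ratio of the two published bandwidth numbers:

```python
# The "66% advantage" in the quoted HardOCP summary is the ratio of the
# two published memory bandwidth figures.
hd7950_gbps = 240.0    # Radeon HD 7950, 384-bit bus
gtx660ti_gbps = 144.0  # GTX 660 Ti, 192-bit bus

advantage = (hd7950_gbps - gtx660ti_gbps) / gtx660ti_gbps * 100
print(f"HD 7950 bandwidth advantage: {advantage:.1f}%")
```

That works out to 66.7%, which the article rounds to 66%.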
So, well... I don't think I even need to say anything about his conclusion. If the 660 Ti performs faster even once, it's pretty much over, correct? He was absolute in saying the 600 series tanks, the 660 Ti tanks, the 7950 doesn't, etc... Maybe "the team" thought people would just keep getting lost in our walls? :) You quoted these statements of mine in your response to me responding to luciferano before, so you were seeing them...LOL.

You can tell me anything about how I talk, that my "netiquette" is bad, etc... Heck, call me the devil. It matters not. Show me your case, since my case is pretty clear here. He said it and is wrong in all cases. I said wrong at every turn with luciferano because he was. So was blaz, and you for defending this. I can't sugarcoat that for your feelings after repeatedly hashing this out with you. I'll say it again, as you say, "just for clarification" (insinuating I wasn't clear... again with the attacks? What the heck can I say that is OK? You can't "school" me on netiquette while saying the same stuff - which doesn't bother me a bit; get thicker skin, people, I'm just pointing it out since people seem so stuck on this crap)... I hope he's going to stop saying that the 600 series tanks and the 7000s don't. Or even that they keep chugging along...

We can't show AA with and without DIRECTLY here, as you said (only super high vs. super high vs. another resolution). But you can't tell me this will help him. The drop from 1600p to 1080p is larger vs. the 660 Ti's in cases (disproving a bandwidth problem on the 660 Ti, which is the ENTIRE point of HardOCP's article here... it's all about disproving this bandwidth issue, keying on the 660 Ti for that reason vs. the 670 and 7950B). As noted before, I can find something to show we can force it here, but it won't be reality in gameplay, as HardOCP clearly states. Again, not helping the dominant-7000s-over-600s blanket statement.
You can say he's 50% wrong (heck, insert any number), but you can't say he's right when he makes blanket statements like he did about the 600s vs. the 7000s. If I say X is ALWAYS faster than Y and all X sucks at blah blah, then if I'm proven wrong even ONCE in gameplay, I'm pretty sure my blanket just lit on fire 😉 Taking his 3 statements together, it's pretty clear what he said any way you slice it.

I can sit here all night trying to please your sensitivities, or just state the facts and move on. It seems nearly impossible to word a statement that people are wrong without offending someone, or without someone else reading far more than they should into it regarding netiquette. But it's not going to change what I said DATA-wise. Which is the only thing I care about anyway 😉

Here we go with more games people play: oops...Offending someone I'm sure 😉
What you consider underclocked and overclocked may be different from what other people consider them to be, because last time I checked, any card sold with clocks higher than the reference clock is "factory overclocked," and manufacturers have to do things like have chips binned good enough, possibly have to increase voltages, provide warranty for a card that may wear out faster.
You didn't even read the first sentence:
I didn't ask for anything over OOBE, just OOBE, but the exaggeration on my part was merely to make the point; the accuracy of the downclock matters not. You did miss the point: we're talking about a DOWNCLOCK, not overclocking. If I were asking them to find the max and then bench it there, that would be completely different. That would be STRESSING the card.
Not quite sure why we're discussing what I consider over or under. I clearly stated, over and over, to you and him in the conversation that brought me back to this forum, that I don't want reference clocks when cards don't ship at reference. Last time I checked, people run their cards at whatever they come at out of the box unless overclocking (whether more, or to begin with, matters not), and manufacturers don't care what I consider over- or underclocking. Are you trying to say people run them slower than what they bought them at (like I said, unless you're trying to beat a hot room like me, who the heck does this?)? I'm not sure why you are missing that I was asking to benchmark OUT OF THE BOX, but you start making statements about what I think vs. what others think is over or under? Because OOB is what it comes at. You're on "the team" alright...LOL. Nitpicking at things I didn't say, when I was very clear; I have no other way to see this type of stuff.

Does it really matter what is considered overclocking or underclocking here? Does it matter what the manufacturer has to do to get a card to do whatever? My argument was that nobody takes a card home and purposefully downclocks it from what it pops out of the box at (most don't even know what OC or UC is - of course people reading here do, but not in the real world where Tom/Dick/Jane buy whatever's on the shelf and run home with it). What part of that do you not get? They can bin it, paint it pink, sprinkle magic fairy dust on it, etc. Once you get it home, you don't SLOW it down, do you? The rest of this stuff is just picking at me for nothing and attempting to piss me off. :) None of your statements have anything to do with what I said. This is just like luciferano: we are not talking about the same things. OOB doesn't change no matter what someone does, says, etc. If the manufacturer ships it at X speed, the user takes it home and runs it at that speed or overclocks it.
That has nothing to do with what a manufacturer cares about, what I think or what anyone else thinks. I asked that it runs OUT OF THE BOX (BTW, those are caps not bold :)). You can talk to me about what anyone thinks about over or underclocking all day. But what does that have to do with asking someone to run it as it's shipped no matter what it is shipped at? My point (in light of NOT seeing the article YOU pointed me to that I thanked you for), was I'd like to see the performance as it ships without anything said about over or underclocks and why you may do that after you get it home. Again this wasn't regarding these cards either, it was articles as a whole in the future etc...The practice of changing out of the box.


Are you claiming that if one manufacturer puts out a card that comes OOB with a high clock, every other card in the market suddenly turns underclocked? If you claim that the normal clock is the most common clock of the cards found on the market, what if stock shifts and cards with a higher clock become the most common? Do the lower clocked cards suddenly become underclocked?
Nope. I'm claiming, #1, that you shouldn't be slowing them down when NOBODY does that. Let me say it again, #2: take it out of the box and leave it alone. Is that simple enough for you? Trolling now? :) By your definition, isn't that what you're doing? I don't care what it runs at. Don't care if one is slow, another is fast, another is superclocked, another is super-downclocked. It matters not. Just don't change them to make them run slower on purpose. They didn't ship it like that, did they? Make sense? It's comical that you even started this and can't understand the point. But then, I do not believe you are this dumb (team stuff?). I think you fully understand what I said. But nice try 😉


Anyway, based on what you were saying, blaz was just suggesting that your analogy may have been better worded the other way around and with different figures. You said that you wanted to see the cars being run at 200+mph, which is pretty much their max speed (i.e. limit) if you look at the two figures you brought up, 202 and 211mph, and see if they would handle well, if the engine blows up, etc. If we were to put this in terms of cards, this analogy would imply that you want TH to overclock the cards (close) to their limit and observe their characteristics (like temperature, durability, and stability in general). That's why your analogy may have been false. blaz gave more moderate figures like 65mph and 55mph, which don't need to be accurate since it's just an analogy (the way you took those figures against him, it seems like you were the one playing with semantics, or something similar in nature).

What something is "intended" to run at is subjective. For one thing, I don't think a car is intended to run at its max speed, or anywhere close, all the time. I could imagine one or more parts failing sooner than expected if that's the case, and that doesn't seem to jibe with the word "intended." Anyway, if a graphics card that comes factory overclocked is intended to run at that clock setting, then that moderate number, 65mph, sounds like a reasonable comparison to me. The card could probably be overclocked more (including overvolting) up to a certain point, just like how the car can be run faster, but the manufacturer of the card "intended" for it to run at a certain speed, which is proven by the warranty they provide for that speed. I don't think cars have any specific speeds that they are "intended" to be run at, but given my point above, I doubt it's their max speed. The 55mph figure seems analogous to how TH underclocked the factory overclocked card.
blaz's analogy seems more sound than yours did, and even though we did get what you meant even without that analogy, him suggesting a better analogy wasn't just a matter of semantics I would say.

If you can't overclock a car, then why use cars in an analogy to this topic of ours? I would say you're the one who missed that point of this article. It's not about specific models (or what they're clocked at) but about the GTX 650 and 660 (at reference settings, of course, since they are called "reference" for a reason). What you should take note of is that overclocking is just increasing the max clock of a card (increasing how far it can be pushed, pretty much). If a certain model of car can be bought with a V6 engine and then you modify it by replacing it with a V8 engine, then maybe you can compare that to overclocking. If the manufacturer already offers a V8 version, then you may consider that as being factory overclocked.
The cars are downclocked (so to speak) at 200mph. You need to read it again. I said they are NOT overclocked at all, as their max speeds are BOTH over 200mph. I didn't ask for them to be run over their speed (say 250mph... didn't ask for that, which obviously mods to the cars could achieve, making this NOT the max, much like a truly changed-from-reference card set up for OCing with changed fans, ICs, etc.). Look up the definition of semantics. I purposely picked cars that were faster than 200mph so I would be correct 😉 Nobody calls their friend and says his new 211mph car runs great at 40 (insert any number you want below 211). "I can run it really great below its shipping speed"... ok man. Got it. I said the same thing twice (cars and cards). That is semantics. You say potato, I say potahto. Correcting me just means you know how to spell; it doesn't change that we're talking about a potato 😉 You missed the point. I said OOB. Ergo... wait for it... change nothing, run as shipped. You can spin that statement all night if you'd like. :)

You missed that I didn't care about these cards and was talking about many future articles, not the 650 etc. in the review here at Tom's, hopefully tested out of the box (you also missed that I said I didn't see the new article with all of the cards compared, because it came a month late and I left, as noted below). LOL. Here's the point about commenting when you haven't read the whole thread, no? And it's actually a discussion with YOU and blaz from days before. If you haven't read what I said, what is the point in commenting? I even thanked you...LOL. Read it... It was a response to YOU pointing out the link a few days ago...ROFL
Thanks for pointing it out (see, the ref stuff got me ignoring them for a time 😉). I didn't notice it until I'd posted my other msg...LOL. I've actually been a little busy and just missed it, I guess, but I wouldn't have gone away so quickly if I'd had that data at launch :)

Thanks again for the replies guys. Have a good one (as I see the sun is now up...rofl).
09-20-2012 at 10:16:29 AM
You really don't want me to be nice to you do you? :) Yet you said I was so nice to you. Correct, I thanked you for pointing me to an article that gave me what I was REALLY after to begin with...ROFL. But you keep trying to offend me with this stuff and all the finger shaking you did at me...LOL. This is getting old. We actually already had this conversation so to speak. Not sure how I can say that more politely. I'm pretty sure by your definition I should be offended by now many times. So please no more netiquette talk, lets get back to the data huh? The last two paragraphs of my last response to you laid out all the stuff you said or insinuated about me, not that it matters I'm not so sensitive (can't believe the country has been reduced to being worried about a conversation minefield but I digress). Nuff said about you shaking fingers at me :)

Luciferano said my posts were "ridiculously long" and badly "formatted and hard to follow". Whatever you get back after that is well deserved, by your definition. I even apologized for the walls before hitting SUBMIT in those posts...LOL. This is akin to Obama/Clinton running $70,000 ads apologizing for some dumb trailer made 6 months ago, when they killed our ambassador, 2 Navy SEALs, and an aide. If you've killed our people, what right do you have to ask us to apologize for what our country is built on? FREEDOM OF SPEECH AND RELIGION, ETC. I should be careful about offending him after his comment? By your words, he should have just said "could you use the quote system so it's easier to follow, please". Which would have gotten a "sorry, I was in a hurry, I'll try to fix that next time". If someone punches you in the nose and you punch them back, I wouldn't be coming after you for doing it...LOL. Japan bombed Pearl Harbor Dec 7th, 1941. We kicked the crap out of them with A-bombs they were rushing to build to kill us with (we just won the race). But we pay them for our atrocities...ROFL. Jeez. But it was apparently completely ok to kill our people first?...LOL. Getting the point here?

I would say the speed of car isn't analogous to a card's clock, but more so to its performance (framerates). You can drive/utilize a car under its max speed and you can run/utilize a card below what its current clock offers.
I would say you're wasting my time 😉 But whatever :)

Sorry, I just had to give my opinion on this... :)
What for? To troll again? It would seem this is when (if I was cleeve) I'd be calling you a troll no? How did you put it, "being unreasonable"?
Your quote may not do justice for the Cleeve-jaquith-blaz thing. For instance, it seems that Cleeve called jaquith a troll because he was being unreasonable so much that it pointed to trolling. Something like this is stated in that same post from where you got that quote of Cleeve.

I remember reading through that scuffle during a time when it was still fresh, and, honestly, I don't think jaquith was totally wrong with whatever he/she was saying, though it might've been more of an attitude problem, not to mention forcing his/her own preferences.

Interesting info. TH is pretty much the only site I follow to this extent. I'm not sure what may have happened back then, but I hope TH is clean today, for this is where I put my faith.
Ouch single sourced info from a site with nefarious history as shown (no comment against NEW owners here, or cleeve as I said before)...But maybe that's just a time issue...Though from all your posts to me...I'd say your time is more wisely spent gaining more data from MANY other sources (insert fav 1/2 dozen sites here).
Well, that goes to show that I don't really have anything better to do

Not quite far enough on his team yet? He sending you PM's? I'm just saying... 😉 Should I be calling you a troll yet? Really, you may want to find something "better" to do since you seem bored. You said you had nothing better to do. I can see it now; forgive me for not paying more attention to your comment. I'm starting to feel like I'm feeding the troll here. Are you drinking on your day off or what? I could at least attempt to forget all this if you were 3 sheets to the wind. Everyone does crazy stuff on twitter occasionally... But otherwise...Wow. Scratch that, I have no accounts on any facebook/twitter etc and never will so I'm correcting myself here...LOL. :)

I'm not saying every single one of your points were wrong. A lot were factual. The ones that didn't seem so might've been mostly your claims against blaz. Anyway, they might've been factual and useful, but most of it may have been irrelevant to actually proving how the guy was wrong...
Did I prove he was wrong this time (after repeating what I've already said again and again). It's shorter and not poorly formatted 😉 Anything here irrelevant to the statements made against me or irrelevant proof? 😉

All of these quotes from you here: army_ant7 09-26-2012 at 06:51:12 AM
It was 6am and it seems you had the day off (much like me), so maybe 3 sheets to the wind? Not that I'm offended anyway; just trying to excuse you.
Hasten did say this, semi-backing off (then attacked again anyway... LOL):
I may have been a bit intoxicated when I made that post as well!
hasten 09-22-2012 at 03:00:14 PM

This is why I asked :) But his was mid-day... heh. Yours is more understandable at 7am, possibly after being up all night having fun in a forum. Since it took you hours, you may have been pretty toasted around, say, 1-3am. 😉

And to use your words:
I hope you don't start hating me for this. I don't like "losing" people who have been nice to me. :)
But then, I'm not sure if that last part is true with your comments to me... But I'm trying to be nice to you here still 😉 I'll note, I haven't had a SHOT, or this might have been much different (again, a nod to an excuse for you) :) Not that I really care (as stated, not easily offended at all), but others might wonder what is up with the comments from you etc., and you may want to rectify what they think :) Me? I'll be ready for another lively debate next week with you :) It was fun for the most part, and I have no anger towards you. No worries here :)

I have company arriving a day early today (ugh! and more Friday), so I have to bow out now for a while :) Well, rephrase... I'll have a "short one only" hopefully in our next issue, if one ever comes up :) These posts together took me hours too. I don't believe either of us can say we're much smarter for it, unfortunately. But hopefully a few people got some good sites and more info from it than we did, as you alluded to, and can use it to make a more informed purchase in the future. :) I love lurking in financial/tech forums just for that. Whether they have a lively debate or not, I get lots of links to good info. I really don't care if one person calls another anything etc. - the info is still good to me, and I'm usually more than happy to just run away with it... :)

I only came in here and asked a question regarding the policy of testing at reference clocks in reviews. I never expected it to turn into a few days of defending myself. I may just go back to lurking. While fun to some degree, all of this (no offense) was a waste of my time. The only thing I can say I've gotten from this is the link to Tom's article that I was really after, as noted above in my thanks to you originally. WOW. That really is sad given the volume of exchanges.
 


{SNIPPED - he said much more than you're saying above and I didn't see any links from blaz - just a pic}

Don't have time to pick apart your statements. See my post responding to army_ant7 about what blaz REALLY said and why I picked it apart; jeez, it's the same quotes I pointed out to you already. I know it was about scaling. I know bandwidth plays a role in performance. But as HardOCP says, it's not the only factor and shouldn't be discussed as such. Blaz clearly said he wouldn't want a 600 series card over the 7000's, with MSAA/1080P as the reason (scaling or not, that wasn't the point... he wouldn't take it no matter what). But I proved the 7950B doesn't beat it anyway (let alone the entire 600 series, as he said) in high AA/1080P.

Show me some benches in a game where the Radeon 7950B douses the entire 600 series, making them not worth buying with MSAA (Blaz said he wouldn't take a 600 series over a 7000 for 1080P high-end gaming with AA - that is NOT a scaling comment, it's an "X is always better than Y in Z situation" comment). You can't at anything playable. Pushing bandwidth to the point of hurting the 660TI like he said (and like HardOCP disproved, which was the point of their high-AA tests) ends up with both sides under 30fps.

HardOCP's point? Read their summations after each game. Where one tanks, they're both below 30fps (making that test useless for either side to claim victory, correct?). Where they run fine, they're both fine. ERGO - worry about the real world and 30fps, not so much the bandwidth and a weakness that really doesn't exist in light of this. Either you didn't read anything I wrote, or you're just ignoring it, as I never claimed bandwidth means nothing. Just that you have to push these particular cards to places we wouldn't play at to show it (but he used it for all cards). Where it became an issue in HardOCP's results, you couldn't play on EITHER card.

I'm done with you until you can show 600's getting doused by the 7000's (he actually said 7950's in that statement vs. the entire series, and you said a JUNK 7950... LOL, which even army_ant7 couldn't defend - I'd expect YOU to show a DEFAULT-CLOCKED 7950 vs. a 660TI OOB superclocked to the max - you said JUNK 7950 BEATS BEST 660TI) in a situation where one is better than the other at 30+fps. Don't point to something where it doesn't win everything, either. I'd expect, given his statements, that I shouldn't be able to point out a 600 series victory at all at 1080P with high AA. Otherwise that kind of negates the comments he made, scaling or not.
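To make the "both under 30fps proves nothing" filter concrete, here's a toy Python sketch. The card names, test labels, and every fps number are made up purely for illustration; none of this is real benchmark data:

```python
# Hedged sketch: count wins per card, but skip any test where BOTH cards
# fall below a playability threshold - an unplayable result proves nothing
# for either side. All numbers below are invented, not real benchmarks.

PLAYABLE_FPS = 30  # the playability threshold discussed above

def meaningful_wins(results, threshold=PLAYABLE_FPS):
    """Tally wins per card, ignoring tests where both are unplayable."""
    wins = {"card_a": 0, "card_b": 0}
    for test, (fps_a, fps_b) in results.items():
        if fps_a < threshold and fps_b < threshold:
            continue  # both unplayable: this test is a wash
        if fps_a > fps_b:
            wins["card_a"] += 1
        elif fps_b > fps_a:
            wins["card_b"] += 1
    return wins

# Hypothetical 1080P high-AA results: (card_a_fps, card_b_fps)
results = {
    "game_1_8xmsaa": (22, 26),  # both under 30fps: ignored
    "game_2_4xmsaa": (48, 44),
    "game_3_4xmsaa": (51, 57),
}
print(meaningful_wins(results))  # -> {'card_a': 1, 'card_b': 1}
```

With the unplayable test thrown out, the made-up data above is a draw - which is roughly how I read HardOCP's per-game summations: where one card "wins" a tanked test, neither is actually playable.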
Regardless, HardOCP has already shown that even junk 7950s such as the XFX models can generally beat the best GTX 660 Ti models in average gaming performance unless you really hate high levels of AA (obviously excluding SSAA and since no amount of FXAA is a high amount of AA, also excluding FXAA).
Nuff said. If you don't hate high AA levels, heck, show it in SSAA if you can; I'm sure jaquith wants to see this too, which was his point against Tom's ONLY picking MSAA (I'm pretty sure he was saying you might lose in SSAA and FXAA, and don't even have TXAA) :) Again though, we're talking 1300MHz or so on the BEST 660TI vs. a DEFAULT-CLOCKED 7950 (reference is junk, correct?). You'd at least better show it beating a superclocked out-of-the-box 660TI, or you're pretty wrong here, correct?

I'd argue about avg vs. min with you, but what's the point? Suffice to say, MIN matters when games become unplayable slide shows; max or avg doesn't. You can run 100fps, beating me all day while I run 50fps, but if you spend half the day at 20fps while mine rides 40-50fps all day, I'll take my card every time. I will never get shot in the head by you because I didn't know where you were due to dropped frames. However, I'll routinely pop up behind you (in front too, but you get the point, I hope) from "out of nowhere," so to speak, and just keep popping you in the head. Pro gamers understand minimums are important. I'm not one, but I get the argument and have had it happen to me... LOL. I promptly turned stuff down until I never dropped below 30fps, as a few newbs kept getting that "invisible hit" despite my skills... LOL. I'm not sure why HardOCP chooses to key on avg, but they always point out IF results are UNPLAYABLE there anyway, so it's OK either way for them.
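The avg-vs-min point is easy to show with a toy calculation. The frame-time lists below are invented for illustration only (no real card behaves exactly like this):

```python
# Hedged sketch of the avg-vs-minimum argument: a card with the higher
# AVERAGE fps can still be the one that stutters. Frame times are in
# milliseconds per frame, so fps for one frame = 1000 / frame_time.

def fps_stats(frame_times_ms):
    """Return (average fps, minimum fps) from per-frame render times in ms."""
    fps = [1000.0 / t for t in frame_times_ms]
    return sum(fps) / len(fps), min(fps)

smooth = [20.0] * 10               # steady 50fps, every frame
spiky = [10.0] * 8 + [50.0] * 2    # mostly 100fps, with 20fps stutters

avg_s, min_s = fps_stats(smooth)   # 50.0 avg, 50.0 min
avg_p, min_p = fps_stats(spiky)    # 84.0 avg, 20.0 min
```

The "spiky" card wins the average (84fps vs. 50fps) by a mile, but its worst moments are the sub-30fps drops where you get that "invisible hit" - which is exactly why I'd take the steady card.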
 
You're mistaken, I never called anyone a troll for pointing anything out. You can point out things all day, feel free to do so. That doesn't make you right, doesn't make me right either. But that's not important.

I called him a troll because he repeatedly suggested bias and impropriety instead of arguing a point.
That's unimportant too. I brought it up to show a personal attack by army's definition. As mentioned, jaquith (etc.?) was not eloquent (thus deserving it? Kind of also my point... :)), and I stated you both had points.


I didn't run any other modes because I didn't have infinite time. If I had weeks to do the review I'd have added many more modes. We're not given the luxury, and I prefer to add more games than modes. And I feel the modes we used are valid because the methodology is valid.

You obviously feel otherwise, but every game I tested was quite playable on the 660 with the AA settings I used. I was there, I saw them. As I stated earlier, I chose the settings based on what the 660 could handle and went from there. Some games have no AA, some have 4x AA, some have 8x AA. If I had a hate on NV, and knew ahead of time that the 660 had a problem with 8x MSAA, then why didn't I apply 8x MSAA to all benches? Why didn't I use the global illumination setting in DiRT Showdown, something GeForces have a big problem with? Clearly, there's no anti-Geforce agenda here.
My comments weren't really about this article (I said I had no interest in the cards here; I didn't even read it, just checked to see if they ran reference again and quit right there after checking the test setup page), but a response to blaz's 660TI pic, really. Even my questions when coming here were regarding the practice in all reviews, not this one particularly, and before I saw the article Army pointed out, which gave me what I wanted, as I said in my thanks to him for pointing it out. My only comment then was that I wished it had come first (again, a comment on the 660TI review, not this one), rather than a month later. But that's just an opinion and says nothing about you.

I didn't "troll" on about it, as I saw both points and just read elsewhere for the rest of the picture (already had, from launch articles on the 660TI), realizing that you don't have time to do everything, as you said. Nobody does, which is my reason for getting the whole picture from many reviews. I'm not aware of making comments about whether things were playable or not in your article (pretty sure I didn't, since I didn't even read your 650/660 article at all). If I did, I'd be more than happy to apologize :) If I accused you of an anti-GeForce agenda in this article, again, I'd be more than happy to apologize. :) Maybe you're directing those at someone who called you out on this article?

We could argue about DiRT Showdown, but that would be opinion anyway. The game is rated so poorly at Metacritic that I'd rather see old DiRT 3, but I never brought that up either. Most of the reviews say don't buy it; get DiRT 3 instead if you have neither. But I never mentioned Showdown, unless I'm really forgetting something I said (I'm not young anymore... ROFL). I never mentioned anything in your review specifically; I just wanted another mode (in future articles, really) and cards tested OOB in the future (but you already did that in the article Army mentioned, which I didn't see at the time). Had I seen that article at launch, when I was last here, I'd have never even made the comment. My bad for not checking whether you'd done the other a month later. But I noted it was what I was after anyway, and I didn't comment on that article here either. I gave thanks to army_ant7 with no comment about the actual article, that I'm aware of.


And frankly, saying "I'm not willing to say someone has a bias yet" is, actually, suggesting someone has a bias in a slightly underhanded way. 😉
Not sure I agree, when I was pretty clear about saying it was HIS comment, and that I'll read 10 of your reviews myself before I believe it. I made it quite clear it was based on stuff from before BestOfMedia even bought this site. Also, I only mentioned TOM PABST himself as really being involved vs. Van Smith. No other name was mentioned. I'm assuming this was before your time, as your name is not in my memory of that era, but I did leave for a while, and back then Tom wrote most of the stuff himself, so perhaps my memory is fuzzy on when you came in. But I never said you had anything to do with it, or even alluded to you being involved in ANY way, shape, or form. I didn't think you were here then (still don't, but correct me if you were). To be honest, I think TOM was the only one involved, as he ran the whole show back then, pretty much top to bottom.

Anyone who says "I come to X site, read their reviews, and just blindly believe everything they say" is, frankly, foolish. That's why I read a dozen or more reviews on any part I care about. :) If you try that with a politician, you'll quickly have no money in your wallet :) They can spend whatever is in it better than you... I promise... ROFL. /sarcasm
I think I can get away with saying that... They are all getting money from lobbyists, including my favorite people :) The benchmark shenanigans happened ~2000-2003 or so, if memory serves. Just saying someone should read a lot to form a decent opinion doesn't mean anyone cheats either, IMHO; just that it's good practice to check multiple sources before believing anything, just in case (I think this should apply to life in general, really... or you'll eventually be taken for a ride/scammed).


I'm really happy with how this review was carried out. I'll perform future reviews with the same procedure, too. And if our method isn't cookie-cutter with other sites (and what some folks feel comfortable with), I don't have a problem with that. I'm happy to serve the readers who appreciate another valid perspective.
You missed the part where I said you both had points, and that he didn't say what he said eloquently, I guess. That's basically saying it's at worst a draw, and more pro-you than him, I think. :) But like I said, I don't think I said anything about your review in either case (650 or 660TI); I just wanted another mode and OOB clocks. Which is totally up to your preference, hence nothing more was said to you. Maybe that stuff is all directed at some other poster too. I didn't read to see if someone was nasty to you or something; it wasn't pertinent to my discussion with Army etc... But it wouldn't surprise me in a forum... LOL. Sorry I didn't get back to this earlier to clarify.
 


To me it's a very, very important distinction.




A great strategy. There's always more than one valid perspective. I learn a lot from other reviewers, too, there's never enough time to test everything and it's interesting to see their approaches. :)
 