GeForce GTX 295 Performance: Previewed


cleeve

Illustrious
[citation][nom]TheGreatGrapeApe[/nom]Yeah I feel that too, there are still some in-depth looks, but they are rarer and the site seems more geared towards marketing and selling than investigating and learning.[/citation]

As a writer, I can confirm that is *not* our mandate.

This is a 'preview', guys. Nothing more, nothing less, and admittedly one covering pre-release hardware.

There's nothing wrong with having a look at the upcoming stuff and reporting on it; that's what the readership is here for.

Now if it's after launch and the cards are not available in retail, they're more expensive, whatever - we'll be reporting on that, too.

It's not so long ago that I was just a reader of Tom's and not a contributor, and I can't imagine having a problem with a 'preview' article that doesn't claim to be anything more than that. I enjoy reading a preview before a product is released to get a sense of where it'll sit in the scheme of things; you obviously had enough interest to bite as well, so are you suggesting we shouldn't have published it at all?
 
TheGreatGrapeApe

Cleeve answered your other spew, I'll quickly address this one.

[citation][nom]silicondoc[/nom]I keep hearing this fanboy argument about ATI making more money per card because of GPU size differences. But then the reds never seem to want to mention the expense of GDDR5 memory. What's up with that, small-GPU guys?[/citation]

GDDR5 is more expensive, but only slightly, and that extra cost is offset by the more expensive and complex PCB required to service the GTX's wider memory interface, so the trade-off is roughly a wash there. And from everything we know of enclosure/HSF design and cost, the GTX's big shroud and large copper sink cost more too. So it's doubtful that the GDDR3 chip saving on the GTXs makes up for the other costs associated with their design.
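
To illustrate the kind of offset I mean, here's a rough back-of-the-envelope sketch. Every dollar figure in it is a made-up placeholder just to show the structure of the trade-off, not real BOM data; the chip counts are real, though (8 GDDR5 chips on the HD4870's 256-bit bus vs. 14-16 GDDR3 chips on the GTX's 448/512-bit bus), and the chip count is exactly what drives up the PCB complexity.

[code]
def memory_subsystem_cost(num_chips, cost_per_chip, pcb_cost):
    # Memory chips plus the PCB needed to route their bus width.
    return num_chips * cost_per_chip + pcb_cost

# HYPOTHETICAL prices below -- placeholders for illustration only.
hd4870 = memory_subsystem_cost(num_chips=8, cost_per_chip=5.00,
                               pcb_cost=20.00)   # GDDR5, 256-bit bus
gtx280 = memory_subsystem_cost(num_chips=16, cost_per_chip=3.00,
                               pcb_cost=45.00)   # GDDR3, 512-bit bus

print(f"HD4870-style (GDDR5, simple PCB):  ${hd4870:.2f}")  # $60.00
print(f"GTX280-style (GDDR3, complex PCB): ${gtx280:.2f}")  # $93.00
[/code]

The point survives any particular numbers: a wider bus means more memory chips and more PCB layers, so cheaper-per-chip GDDR3 doesn't automatically win at the card level.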

When discussing products like the one in this preview, you have a two-PCB product (GTX295) competing against a single-PCB product. Which one costs more depends a lot on the complexity of making the X2 board, but I doubt the numbers heavily favour the immature new two-PCB solution right now, during a potential price war. It might slightly favour the GTX295, but even that I would doubt at this point.

Now remember, ATi and nVidia primarily make their money from the chips, while the AIBs are stuck with the board costs based on the IHV's reference designs.

So if ATi is able to get 2-3 chips for every chip nVidia can get from each 300mm wafer, then it's much cheaper for ATi to bring their part to the AIB, and their minimum cost stays much lower. At 65nm the advantage was well north of 2:1, which means that even for ATi selling X2s against GTX280s, it was still cheaper at the chip level; and even if nVidia gets these 55nm GTX295 chips out at an improved, exact 2:1 ratio (unlikely with a mature RV770 against a new 55nm GT200), you would still end up with a 2:1 cost per solution at the chip level.
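
To put rough numbers on that yield argument, here's a minimal sketch using the standard gross-dies-per-wafer approximation, with commonly cited die sizes (RV770 ~256 mm², 65nm GT200 ~576 mm², 55nm GT200b ~470 mm²; treat those as ballpark assumptions). It ignores defect yield, which hurts big dies disproportionately, so if anything it understates ATi's advantage.

[code]
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Standard gross-die estimate: wafer area / die area, minus an
    # edge-loss correction for partial dies around the circumference.
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Commonly cited die sizes in mm^2 (ballpark, not official figures).
dies = {"RV770 (55nm)": 256, "GT200 (65nm)": 576, "GT200b (55nm)": 470}

for name, area in dies.items():
    print(f"{name}: ~{dies_per_wafer(area):.0f} gross dies per 300mm wafer")

# Output: RV770 ~234, GT200 ~95 (~2.5:1), GT200b ~120 (~2:1)
[/code]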

As for your price-gouging comments: if anything, ATi under-priced the HD4Ks, forgoing potential profits they easily could've made before the financial crunch, which really hurt both companies' bottom lines. nVidia wasn't against price-gouging; they wanted the higher price at first, but then they realized they were faced with a capable but much cheaper competitor. Their only option was to drop prices; even if it meant losing money on every chip/card they sold, it was still better than losing more by not selling cards. Now that they've got 55nm, their major focus won't be the GTX295, it'll be making the other GTXs cheaper to help stem the bleeding. The GTX295 is a halo product, meant for PR purposes, not for profits.

The difference is that for ATi, with their lower chip costs, the X2 might actually be close to profitable, or at least more so than many previous multi-GPU solutions. And that's what the small-GPU idea is about.

Mega-chips are great for money-no-object consumers, but for cost-conscious IHVs and price-sensitive consumers, the small-chip solution is a better fit, as the HD4K vs GTX comparison has shown so far. You might sell a large number of mega-chips at mega prices, but it's tough to compete against small-chip solutions that can occupy more segments for less. Companies live off the low side of the high end, the mid-range, and the high side of the entry level; this is where smaller is better. The big-chip market is a loss leader you use to attract people to your profit-makers. nV was pretty vocal about this during their GF 6 & 7 series; only recently has it become about the mega-chips, because that's what they have now.

So I don't see why you're against small chips, other than which IHV currently uses them, when the strategy is showing obvious benefit to both the manufacturer and the consumer.
 
As for 'killer deal', yeah, I don't see it compared to others (like the sub-$200 HD4870/GTX260 cards I mentioned [even the $220 GTX 260-216]), but sure, like CF/SLi there may never be consensus. I think it's a good value (especially compared to previous products), but a killer deal to me is something I never would've expected, like a $180 card at HD4870 level when that was the price of the mid-range last year. A $500 top-of-the-line card (one chip or two [be it HD3870X2 or GX2]) just doesn't scream 'killer deal' to me; recent GX2 prices were more 'killer' (where other factors could be overlooked because the cards were so cheap).

[citation][nom]Cleeve[/nom]..Now if it's after launch and the cards are not available in retail, they're more expensive, whatever - we'll be reporting on that, too.[/citation]

Unfortunately, by then the effect the IHV/PR departments wanted is already achieved, and even a retraction, or erasing the (p)review entirely, wouldn't correct things, any more than it did for past paper launches and missed prices, where temporary rebates created artificial competition for one side heading into the other's launch. I have yet to see a single review go back and say "oh, we made a mistake on this months ago, we've fixed it now", or a follow-up that says "regarding our previous review: we were misled/mistaken/overly optimistic, etc." As an example, Tino's review about the low impact of drivers is still up, even though both nVidia and ATi have since launched big driver boosts. But those mistakes had little PR/marketing advantage or play; neither company benefited from the timing, and if anything both were hurt by it.

[citation][nom]Cleeve[/nom]I enjoy reading a preview article before the product is released to get a sense of where it'll be sitting in the scheme of things; you obviously had enough interest to bite as well, so are you suggesting we shouldn't have published it at all?[/citation]

I enjoy reading preview articles that talk about the tech, but I'm skeptical of pre-launch prices and pre-launch performance numbers in any such piece.
Am I suggesting this preview shouldn't have been published at all? As it is now, yes, that's what I'm suggesting. Would I say there should be no previews? No, but the timing and the content just don't seem right. I would've preferred no mention of potential power ratings (especially ones measured at the system, not at the actual card) that can't be posted let alone confirmed, and I would've preferred no mandate. If nV wants to give THG a peek under their petticoat, it should be without conditions, and with the pre-established set of traditional THG benchmarks used in the previous GTX and HD4K reviews; no filter, no preference.

I like reading about hardware whenever I can, but this was a little too gift-wrapped and planned for me. And it's not like this is a new position for me; you should know better than most that I'm not a fan of giving a lot of credence to early leaks, other than as an "oh, interesting possibility" factor, because a lot of things change between those early previews and the final clocks/drivers/etc. Why you think I would give more credence to something that has an IHV's blessing, I don't know. It's like your Avivo vs PureVideo reviews, if either company had given you pre-launch non-WHQL drivers to test with. The timing just adds to that; not as a suspicion that it's fake or that there's bias, just that it's being used more as a tool for marketing than as any kind of information sharing.

I don't trust any IHV, so this is no exception. Personally I would've looked at ANY review here, and you know that as well, so it's not just this review's content/title that got me to bite; it's the content that got me to reply/post (after missing the initial launch while doing the X-ray thing during my holidays). Personally I would've preferred THG's first FULL in-depth S3 review (hopefully with more details than the lame 3DMark-laden things out there now) over this preview.
Or I'd prefer you (specifically) look at the GPGPU acceleration of all contenders in the various NLE and other products getting the CUDA, OGL, OCL buzz lately. But that's just me, I guess, and it seems I and people like me are in the minority these days. I'm fine with that, but I'll still hope and push for more, like I always have, even when it was Lars writing the reviews.
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
[citation][nom]TheGreatGrapeApe[/nom]Cleeve answered your other spew, I'll quickly address this one.
1. GDDR5 is more expensive, but only slightly... So it's doubtful that the GDDR3 chip saving on the GTXs is making up for the other costs associated with their design.
2. As for your price gouging comments, if anything ATi under-priced the HD4Ks forgoing potential profits that they easily could've made before the financial crunch, and which really hurt both companies' bottom lines (BUT ATI IS CHEAPER TO MAKE SO THEY MADE MONEY ANYWAY - WHILE YOUR MORE-EXPENSIVE-TO-MAKE HATED NVIDIA DIDN'T - SO WHO WAS GOUGING?). nVidia wasn't against price-gouging, they wanted the higher price at first (FOR THEIR MORE EXPENSIVE PRODUCTION - NOW THEY LOSE MONEY, RIIGHT? SO WERE THEY GOUGING, OR JUST BARELY GETTING BY AT HIGHER PRICES? YOU'RE OBVIOUSLY INSANE.). The GTX295 is a Halo product, meant for PR purposes, not for profits. (SO THEY DIDN'T GOUGE, OR BARELY MADE A DIME ON IT) The difference being that for ATi, with their lower chip costs, the X2 might actually be close to profitable, and at least more so than many previous multi-GPU solutions. (YOU DON'T KNOW, BUT IT SURE SOUNDS GOOD THE WAY YOU PUT IT - "NV GOUGES BUT LOSES MONEY, ATI IS PRICE FRIENDLY BUT MAKES MONEY") The big chip market...

3. So I don't see why you're against small chips, other than which IHV currently uses them, but the strategy is showing obvious benefit to both the mfr and consumer.[/citation]

No, Cleeve didn't answer anything. Cleeve did what every other protester has ever done: claimed he has never had a problem. Cleeve and you apparently think the general public are hardware testers with lab experience, and all the associated tools. Let me point out, they are bewildered experimenters diving in for the first time, very often. Use the gourd you supposedly have, AND THINK ABOUT IT.
Alternately, the last ATI horror was a proud overclocker with more knowledge than I've seen in these posts, two nights ago. He was ready to kill, and has now sworn them off. I'm sure that "never happened" either, from your perspective or Cleeve's. I know, we've all heard it ten thousand times from the red freaks, and they have to keep saying it over and over.
Even so, Derek disagrees with both of you. I guess he's wrong too, and PRINTS IT at this very site. I'm sure you can spew 20 paragraphs explaining that away as well.
I quite understand the best thing to say is "it all works", but then we all know that any end user worth their salt does not install CCC, which, by the way, is part of the package the company ships with their video card. I know, you're in such a tremendously insane spot of denial with your buddy Cleeve that it doesn't matter to you. Problem is, Joe Xmas-card hasn't got a clue about keeping CCC uninstalled. I'll let ATI know that their end users are all master technicians and review-site bigshots with lab or near-lab capabilities or experience; then we can all agree with you, OK?
BTW - I make a LOT of money because of ATI's screwy setups and drivers. That's how I KNOW. Yes, it would be nice if nVidia did as poorly, because I'd double my earnings in that area.
Now onto the rest of your twisted diatribe.
________________________________

1. I'm sure "it's probably negligible" is a good answer for GDDR5 vs GDDR3 costs, because all of you have the pricing structure handy. And then of course you add in the other costs, as you had to, unable to resist, and what point have we reached?
You probably didn't notice, but quacking on like you did is EXACTLY the problem. I guess you're probably paid, at some level, to do so, and can't help yourself. (Not like I'm any different - YOU SHOULD NOTICE WITHOUT ME HAVING TO POINT IT OUT - except that my job is fixing those problems, not proclaiming they don't exist!)
2. Here once again you go into a long-winded BS spiel, and perhaps you've convinced yourself, but it is all your OPINION, and it's so red-biased I don't expect you to even NOTICE IT.
Nvidia is for price gouging. ATI isn't. ATI makes profit, even on the 4870X2; Nvidia doesn't at the high end. The money isn't made at the high end, it's the low end and in between (bought by the thousands of expert lab technicians installing ATI flawlessly, of course, because you and Cleeve say you don't have problems, along with every other red ever posting here!)...
So we're back to ATI IS GOUGING, not Nvidia. THAT'S WHAT YOU HAVE DENIED, BECAUSE IT'S ALL NVIDIA'S FAULT.
Nvidia is LOSING MONEY, you claimed; ATI isn't, you claimed; their small chip makes it so. So you are just another long-winded red fanboy who cannot help himself.
I'm really amazed by it, it's quite a phenomenon. I hope they are paying you well, because they keep screwing it up and I keep making money from that, and all you know-it-alls know otherwise! roflmao
(The last Nvidia call was SEVERAL YEARS AGO - a Ti4200 as I recall. The end-user marine had games like Delta Force, and a newer one the playing crew got wouldn't run (Call of Duty 2 maybe, something like that). A new driver (I showed up and downloaded the driver after diagnosis) apparently "emulated" the shader the new game needed, and he couldn't knock me out of the chair quickly enough once it was running. It was amazing, but I was disappointed; I could have made twice as much, but nVidia had a great driver that solved it all - no new vidcard sale for me. THAT HAS NEVER HAPPENED WITH AN ATI CARD.)
Now that's what REAL WORLD is, not you guys and your egghead labs or your endless experience and tricks and all the rest.
The TRUTH is - Nvidia just goes right in, and IN FACT it goes in through WINDOWS UPDATE, and for some reason ATI just doesn't.
So in the REAL WORLD, the place you people don't know exists outside your fanboy monster know-it-all BS, this is what happens:
A. Consumer buys card, tries to install, with bro or daddy, whatever.
B. They may or may not use the provided CD, before or after the card is in and the computer boots.
Ba: They get into Windows, and if the driver is installed, the Nvidia card works; if not, it gets a Windows Update driver from MSFT (update is on by default in the world out here; who KNOWS what it is in your testing LABS).
Bb: The ATI card person has a .NET corruption, and they haven't got a clue how to fix it.
Bc: It fights like an insane demon with other installed drivers, their system crashes, my phone rings.
(NO, THEY DON'T KNOW ABOUT DRIVER CLEANER - AND THEY DON'T HAVE A TEST LAB.)
_________________________________________________

On and on it goes: my business phone rings, then I smile at red fanboys.
Please do yourselves a favor and ask your TECH geek friends who actually have to work on something for someone else (THAT MEANS SOMEONE IN THE REAL WORLD, NOT A LAB REVIEW); maybe then your minds can clear up on the matter.
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
I find it very interesting as well that, according to Mr. Know-It-All, top-end vidcards are loss leaders. LOL
So let's see, the 8800 GTX 768MB at, good golly, what was it, $700 at launch, was a loss leader.
Now we have a $500 295, and we're supposed to believe they lose money at $500.00 - but of course THAT'S PRICE GOUGING... NO WAIT...
THE TOP CARDS ARE LOSS LEADERS, GRAPE APE SPEWED....
Not ATI's though.
Nvidia LOSES money (top end) but is a PRICE GOUGER (top end); ATI MAKES MONEY (top end) but never wanted to gouge like Nvidia wanted to gouge....
ROFLMAO
THE VERY SAME PEOPLE TELL US ATI CAN DROP ITS 4870X2 CARD PRICE 50 OR 100 BUCKS AND NOT BLINK. THAT'S CALLED "DESTROYING NVIDIA'S NEW RELEASE". IT'S NOT CALLED PRICE GOUGING - 50 OR 100 BUCKS.
_______________________________________

Like I've said, there is just no escaping the LIES. Red blood rage is everywhere.
You should see how the reviews of the 295 (actual results) are now all lies, except a certain HOCP Crysis chart... now that is the only one that is correct at this point in time, according to you-know-whom.
Oh, and don't forget, the 295 is likely going to be a phantom product; the red ragers at the competition to this site (you know who, and YES, Derek isn't here, my mistake, he's there) have decided it's all a no-delivery scam...
Probably heard it here, too - whatever...
I'd like to thank Tom's though, since up at the top of this conclusion page it says: "... we said that Nvidia set out to usurp AMD’s claim over the fastest single-card title. Almost universally, the GeForce GTX 295 does that ..."
That of course is also a "LIE" already, and it's amazing how many sites are filled with the red rage saying so.
I'll remember, I got my corporate training in this chat thread: ATI makes money and is not a gouger or greedy corporate profit pig; Nvidia loses money and is a raging gouger. Top-end cards are loss leaders, because nVidia's top-end cards are/were overpriced, and they gouged like mad monkeys (but they lost money, because the top end is a LOSS LEADER; thank you so much, GrapeApe, for your brilliantly disordered spewing paragraphs).
Yes, but it's me, of course. It's all me. I've got problems. LOL
Oh, the phone is ringing (not really, but I can't resist) - another red-card noob with problems... have to go! (I'll have to remember that the massive expertise of red users entirely discounts any chance they have problems with their ATI cards. I've decided to say it 100 times, and write it on the tech notes page 100 times, for my penance.)


 
G

Guest

Guest
Wow, talk about crapping on the comment section.

The last 3 pages of comments are crap.

Interesting read, though.
 
TheGreatGrapeApe

[citation][nom]silicondoc[/nom]I know, you're in such a tremendously insane spot of denial with your buddy Cleeve that it doesn't matter to you...[/citation]

Not really; you've already admitted openly in the forum that you are biased, stating specifically: "PS- i'm not a fanboy - I'm a REAL HATER - I don't cheer for one or the other side - I pick either side then absolutely HATE IT"

So I already discount your other comments about reds and such.

[citation][nom]silicondoc[/nom]I'm sure "it's probably negligible" is a good answer for GDDR5 vs GDDR3 costs, because all of you have the pricing structure handy, and then of course, add in the other costs, as you had to, unable to resist, and what point have we reached ?[/citation]

Same as before: you talk about the cost of GDDR5 vs GDDR3 when discussing the per-card return, and of course you add in all the other costs, because you're talking about a per-card cost. Why would you focus on just the GDDR5, unless you were ignorant of the other imbalances? It still doesn't change the chip-size argument, since the GDDR3/5 issue remains an AIB concern, not a chip issue, for a chip that's smaller yet still includes the added support without the need of an NVIO. So for your small-chip vs big-chip complaint, you're reaching, just like in your other posts, both at THG and other forums, where you troll from place to place, often blaming Google for your late participation, to add your worthless 2 cents (must be Zimbabwean cents).

Simple solution: if you think the GDDR5 cost is enough to make up for the chip yield differences, then post your numbers to support your claim; otherwise you're slinging BS as usual.
 
G

Guest

Guest
Perhaps we'll see an increasing number of DX10.1 games to bust this thing performance-wise.
 

silicondoc

Distinguished
Feb 7, 2008
82
0
18,630
You weren't aware of other costs - blah blah blah blah blah - and the idiot can't read the point between the lines, and instead makes a gigantic PRETENSE, as if to take it seriously.
Why expect anything else from a liar?
I certainly don't.
The point remains: all the little spewing bratmouths who have claimed nVidia is a scalper, and then fawn over their red corporate love, have got it 100% screwed up. They whine and wince about nVidia gouging, claim at the same time that nVidia loses money, further their insane mix by claiming the gouging company can't make a dime and in fact LOSES money at the prices before us, and then claim the top end is a loss leader - so of course there was no gouging or profit to begin with!
ROFLMAO
Yes, the red fanboy is twisted into a SICK pretzel of lies.
Time to STOP IT.
Even the dummy with the rebuttal did it - in his "explanation"!
ROFLMAO
I certainly DON'T expect that you'll change, reds, but at least you have a chance now to know how crazy you are.
Take a look in the mirror.
 

cleeve

Illustrious
Wow. Wow!

I'm honestly surprised there are still hardcore ultra-diehard fanbois out there who are passionate enough about their brand flag to abandon reason entirely, and cling to wild conspiracy theories instead.

I mean, the only possible explanation is that people who speak favorably about Ati cards are liars, and are out to get you, and want everyone to waste their money on bad products!

Good for you, Silicondoc. Keep the dream alive, bro!
 

cleeve

Illustrious
[citation][nom]TheGreatGrapeApe[/nom]Am I suggesting that this preview shouldn't have been published at all, yes, as it is now, yes, that's what I'm suggesting. Would I say there should be none, no, but the timing and the content just don't seem right. I would've prefered no mention of potential power ratings (especially when pulled from the system, not the actual card) that can't be posted let alone confirmed, and I would've preferred no mandate. [/citation]

Well, I'll have to agree to disagree on this one, dude. Purely as a reader, I enjoyed the piece; I found it interesting, I think Chris managed to give us a good idea about the power requirements of the card regardless of the constraints, and I personally don't think the 'mandate', as you put it, affected the meat of the preview by one iota.

To me, the term 'preview' instead of 'review' is a strong hint that we're taking a peek into something that isn't quite available or even finalized yet - and all that that implies. It's a chance to form some forecasts and see how close those end up being to reality when the time comes.

We can debate back and forth all day, but what is the 295? It's a slightly underclocked GTX-280-SLI setup on a card, with a die shrink. I don't know what more or less you would expect from this piece of hardware. It's going to run well in stuff that SLI'd 280s run well in, and it's going to run poorly in stuff that SLI'd 280s run poorly in, with maybe some small deviations. I just don't see the potential for the major public misleading going on here that you do, Grape. I dunno, maybe I'm too trusting.

Ah well, I think we know where each other stands. As far as GPGPU stuff goes, I've got my eye on it, but I'm waiting for more 'killer apps' to get into the hands of the public before a proper comparison. I'm not sure there's enough real-world stuff out there yet for a GPGPU story that would interest a bunch of folks, but maybe I should keep my ear a little closer to the ground on that one.
 

Nicku

Distinguished
Dec 13, 2008
3
0
18,510
@malveaux: until Nvidia launches this monster, AMD will drop the price of the 4870X2; also, google "RV775" and you'll already see an X2 made from it.

I always appreciate competition. This is a good card, even if they needed 6 months to build it (which is a little too much time), but better late than never. Good preview, Tom's, as always :)
 

DarthPiggie

Distinguished
Apr 13, 2008
647
0
18,980
"the GTX 280 is also quite a bit faster than the Radeon HD 4870 512 MB. Yes, the 1 GB version would likely show some improvement here, though it’d cost an extra $40, too."

What the F@#k is that?!?! Costs an extra $40, too... Ugh, utter bias. The cheapest 4870 on the egg is $200, and the cheapest 1GB version is $225 (Sapphire's), so not only is that wrong, but since when do you guys let the price of a card prevent you from benching it? I would also have liked to see lower resolutions. Even though SLI and CrossFire show their worth at higher resolutions, not all of us have such large screens; 2560x1600 screens start at $1k.
 

Lurker87

Distinguished
Oct 30, 2008
145
0
18,680
Very nice article, guys. I'm looking forward to your complete review of the 295 when it comes out. Regardless, just reading what you have so far is enough to get me excited! Keep up the hard work!
 

sparky2010

Distinguished
Sep 14, 2008
74
0
18,630
[citation][nom]DarthPiggie[/nom]"the GTX 280 is also quite a bit faster than the Radeon HD 4870 512 MB. Yes, the 1 GB version would likely show some improvement here, though it’d cost an extra $40, too." What the F@#k is that?!?! Costs an extra 40$ too...Ughh, utter bias, the cheapest 4870 on the egg is 200$, the cheapest 1GB version is 225$, Sapphire's version, not only wrong, but since when do you guys have the price of card prevent you from benching it? I would also have liked to see lower resolutions. Even though SLI and XFire show their worth at higher resolutions, not all of us have such large screens. 2560x1600 screens start at $1k.[/citation]

It would've been nice to see the 4870 1GB, but then again, this was just a heads-up/preview article, so I'm guessing that when this card is finally released, they'll include more cards. Hopefully the 4850 X2, as I'd like to see how that one performs in comparison, and CrossFire X vs. Quad SLI! This should be interesting; we might see some interesting results!

On the other hand, and this has been stressed A LOT by Tom's and other sites, dual-GPU cards and SLI/CrossFire setups are expensive, and are designed to strut their stuff at the higher resolutions, meaning 1920 and above! There's no point in getting a 4870 X2 or two 280 GTXs in SLI and then playing at 1600 or 1024 at 300 fps; that's useless. People who have the money to buy expensive cards like these more often than not have the larger screens, so it's good enough that they included the resolutions they did!

And again, wait for the official review when this card is released; the article will be more comprehensive, with more cards, resolutions, and all that.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
Grape,

Guessing we're going to have to agree to disagree on this one. The idea here isn't to help anyone except readers, by giving them information. A disservice would have been to withhold any of the situational details that were divulged, thus *not* providing the full story. I'd like to think that readers are able to assess their own situation and make their buying decision (ex. I have $xxx dollars, want to make my purchase by xx/xx/xxxx, and will use all available data to make that choice). If it turns out that Nvidia doesn't hit availability by the date cited, then that'll certainly be mentioned in the future. But it'll also affect the actual purchase of the enthusiast who might have been waiting until the 8th, didn't see the card on Newegg, and went ahead with an X2 using his Christmas money instead.

If someone jumps straight to the results page without reading the details and makes their choice based on that text alone, that's their own lack of due diligence. All of the data is here, and I wrote it to be read. Again, full disclosure is here. Hypothesizing about whether this product is a paper launch or not is a non-issue until the launch happens. For now, we have performance details in a *preview*, though.

Time and again I will maintain that an abundance of information is superior to a lack of it. Let’s say the preview was held until after Christmas, if only to not play a role in a holiday buying decision. Then the card launches on the 8th and my inbox blows up about “why couldn’t you have at least told us about this card earlier so we could make the choice for ourselves? Instead I just spent $500 on a card not knowing what’d be out in two weeks.” See? It can work both ways.

The time for calling out paper launches is a couple of weeks away—eagerly awaiting the 8th!

So, did you at least get a sweet jump in exchange for the bruised ribs?
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
Darth,

Quite plainly, I don’t have one here. AMD hasn’t seen fit to send along the 4870 1GB or the 4830. We’ll work on rectifying this for the full review, but for the brief window available for this piece, that card was simply not available, and the best I was able to do was explain that the 1GB version would, in fact, likely do better.

/tinfoil hat off

Chris
 

philosofool

Distinguished
Jan 10, 2008
49
0
18,530
And the award for fastest unaffordable card goes to...

nVidia.

It's hard to care until I see nVidia bringing this to the affordable performance price range of ~$200.
 

randomizer

Champion
Moderator
[citation][nom]philosofool[/nom]And the award for fastest unaffordable card goes to...nVidia.It's hard to care until I see nVidia bringing this to the affordable performance price range of ~$200.[/citation]
That's like selling a Ferrari for $15000.
 
TheGreatGrapeApe

Just so people understand, I'm not against the card, the idea of a preview, or a look at upcoming technology (we see a lot of Larrabee stuff out there, DirectX 11 presentations at GDC, etc., and are grateful for it). The issue for me was the timing of the nV-initiated action, and their guiding hand. It's like the laws regarding canvassing/advertising outside polling stations. The main motivation for nV to give this sneak peek (something they rarely ever do, just like ATi, intel, etc.) is to make people second-guess their purchases during the Xmas shopping season. This is why paper launches and FUD were effective in the past, and then treated with such animosity that the practice diminished. THG is not alone in printing this preview; it's just sad that we've returned to preview paper launches, instead of actual hardware, to keep people from buying the competitor's stuff. That they offer a sneak peek to everyone during the Xmas buying season, when a day or a week before the GTX280's launch any such 'sneak peek' would've gotten a publisher blackballed, shows the strategic importance of this to them.

It just reminds me of the line from The Departed as to why nV lets reviewers do this this time around: "Cui bono? Who benefits?"

[citation][nom]Cleeve[/nom]To me, the term 'preview' instead of 'review' is a strong hint that we're taking a peek into something that isn't quite available or even finalized yet - and all that that implies. It's a chance to form some forecasts and see how close those end up being to reality when the time comes.[/citation]

I understand that, but that's why in previews I prefer generalities about performance and specifics about the hardware, features, and interactions, not specifics about game performance, because those results get invalidated by shipping drivers and even by the competition's drivers, etc. If it were a preview like all the Larrabee previews we've had, with only cursory mention of performance, then I'd be fine. But when in the past have you known nV or ATi to allow, let alone support, benchmarks of unreleased products being posted? They both get miffed about PowerPoint leaks; that they are doing a wide canvass of sites for previews just reinforces the idea that they know this has a major benefit to their market situation. I'd love to have seen a Hexus-style right of reply, but not from the manufacturer being previewed (nV); get ATi, intel, and S3's reactions to this preview instead.
I'm sure they'd be as interesting a read as the previews themselves.

[citation][nom]Cleeve[/nom]and I personally don't think the 'mandate' as you put it affected the meat of the preview by one iota.[/citation]

Just to be sure, it's not "as I put it"; Chris called it the list mandated by nV. I just don't like pre-conditions, and so I stress the term used by the author. For balance, since ATi cards were used in this nV preview, perhaps their PR guys should have been contacted for their preview mandates too.

[citation][nom]cangelini[/nom]Grape,Guessing we’re going to have to agree to disagree on this one. The idea here isn’t to help anyone except for readers by giving them information. A disservice would have been to withhold any of the situational details that were divulged, thus *not* providing the full story.[/citation]

We'll definitely not see eye to eye on it, and as you and Cleeve mention, we'll just have to agree to disagree. But two things would've made me a little less opposed to a preview. First, sticking to nV vs nV benchmarking, as is done in many previews by places like Beyond3D; this gives people a glimpse of what's coming without getting into the A vs B situation that nV obviously wants from this strategic 'leak'/preview. Second, the *full* story wasn't provided like you say, since you were restricted from revealing everything, yet still made mention of it in some cases. These are minor issues, but they definitely would've removed the tinge of being used by the IHV's PR department that I felt when reading.

[citation][nom]cangelini[/nom]If it turns out that Nvidia doesn’t hit availability by the data cited, then that’ll certainly be mentioned in the future—but also, it’ll affect the actual purchase of the enthusiast who might have been waiting until the 8th, didn’t see the card on NewEgg ,and went ahead with an X2 using his Christmas money instead.[/citation]

Yes, but only after they've already influenced the market numbers for their competition's products; those sales are not going to reflect the 4th quarter of '08 like they should have. That's part of the issue, and it accomplishes the same task FUD does: if someone isn't buying the competition's product today, there's another chance we might get their money tomorrow. Once again, "Cui bono? Who benefits?"

[citation][nom]cangelini[/nom]Let’s say the preview was held until after Christmas, if only to not play a role in a holiday buying decision. Then the card launches on the 8th and my inbox blows up about “why couldn’t you have at least told us about this card earlier so we could make the choice for ourselves? Instead I just spent $500 on a card not knowing what’d be out in two weeks.” See?[/citation]

However, I doubt we'll get that same treatment next time the IHVs have an NDA they don't want you to break, even when breaking it would be in the best interest of the reader, right? The benefit to the reader would be the same then as it is now. But I have a feeling neither nV nor ATi want you to let people know in advance when they have problems and bad news headed consumers' way, or when they're about to launch an underperformer people were waiting for. No matter what, people are going to feel they got screwed at some point; this time all the previewers 'screwed over' the people who bought their cards before December 17th, not January 8th.

[citation][nom]cangelini[/nom]So, did you at least get a sweet jump in exchange for the bruised ribs?[/citation]

Not on that part; more like an unexpected bump, binding release, ejection, then my chest impacting a lightly powder-covered stone. There's really not enough of a base out there for serious jumps just yet.
Good day of 30-40cm of powder through the glades and flats though, so I kept skiing. Monday night was when I felt it most and knew it wasn't just a bad bruise.
 

randomizer

Champion
Moderator
Hey Chris, did you try running Fallout 3 with the 180.88 drivers before NVIDIA told you to use 180.87 instead? Apparently 4xAA would crash upon loading a game. Not a good idea having "preview" drivers that don't work :lol:
 

neiroatopelcc

Distinguished
Oct 3, 2006
3,078
0
20,810
With regards to sparky2010:
"The problem with ATI is that they release good products but give them incomplete/unoptimized drivers"
[citation][nom]Cleeve[/nom]Drivers seem fine to me. Remember, the 4870 wasn't designed to be as powerful as the GTX280. It was made to be more efficient, cheaper to manufacture, and scalable.[/citation]

It does what you say it was supposed to. But it does not provide a stable gaming platform, a convenient platform for multi-monitor setups, or a convenient platform for non-stock enclosures!
I replaced a failed 8800GTX with a 4870 this summer, and while the performance meets my expectations, everything else does not!
The drivers keep crashing when I have my 19" LCD as the secondary, and if I use my old TV as the secondary, I can't play games on the primary screen (22") at resolutions higher than 1024. I still have that annoying fan spin-up issue a short while after booting. And did I mention the graphics drivers crashing? I did, but I'll do it again: it crashes on average once or twice a week. I'm unable to reproduce it; it happens mostly when I'm running WoW with a secondary monitor active, or when I'm playing Flatout: Ultimate Carnage. But then, that's probably because those are the only two I play, having abandoned Far Cry 2 because it performs so badly I stopped trying.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
randomizer,

As far as I know, I'm the one who reported it to Nvidia (I could be wrong on this). But yes, banged my head on Fallout 3 for a good two hours before I called and let them know the game just wasn't running right.
 

randomizer

Champion
Moderator
[citation][nom]cangelini[/nom]randomizer,As far as I know, I'm the one who reported it to Nvidia (I could be wrong on this). But yes, banged my head on Fallout 3 for a good two hours before I called and let them know the game just wasn't running right.[/citation]
That's priceless. But at the same time it's typical. NVIDIA beta drivers that work 100% in everything you want them to are few and far between. I think the first one I found after 175.19 that worked fine was 180.48. Of course, it depends on your system config too. Drivers are such a pain.
 
Status
Not open for further replies.