HD 2600 & GeForce 8600: Where's the Mid-Range?

I recently (by that I mean 2 hours ago) purchased an X1950XT from Newegg for $180. How can these new cards possibly compete with that at a higher price? The article is right: mid-range is pathetic these days! A 128-BIT MEMORY BUS??!?! HAVEN'T WE GOTTEN PAST THAT YET!!?! Sorry, but they are a HUGE disappointment to me, because I was looking to replace the old X850 XT PE with a nice 8600. Sigh...
-cm
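
For reference on that 128-bit complaint: peak memory bandwidth is just bus width (in bytes) times effective memory clock, so at similar clocks a 128-bit bus moves half the data of a 256-bit one. Here's a minimal sketch, assuming roughly stock clocks for both cards (the MHz figures are ballpark, not exact specs):

[code]
# Back-of-the-envelope memory bandwidth: bus width (in bytes) x effective clock.
# Clock figures below are approximate stock speeds, not exact specs.

def bandwidth_gbs(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz / 1000

cards = {
    "X1950 XT (256-bit, ~1800 MHz effective)": (256, 1800),
    "8600 GTS (128-bit, ~2000 MHz effective)": (128, 2000),
}

for name, (bits, mhz) in cards.items():
    print(f"{name}: {bandwidth_gbs(bits, mhz):.1f} GB/s")
[/code]

Even with faster memory chips, the narrower bus leaves the 8600 GTS around 32 GB/s against roughly 57.6 GB/s for the X1950 XT, which is a big part of why the new mid-range feels like a step backwards.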
 
Running an 8800 GTS here; I never use mid-range cards anyway, but I got it at release.

If you were running high-end before, I'd say don't switch to mid-range by any means. Fact is, these cards are OK as of now, and they do beat the old mid-range cards, although they really should be beating the high end of last gen. I'm in Canada and the 2600 XTs can be had for $125. If they're really bad at native DX10, NVIDIA and AMD will surely update or turn things around. I'm pretty sure these will still be able to handle games like Crysis at decent settings at 1280x1024.
 
Interesting article, but in your comparison chart:

GeForce 8800 GTX: 768 MB
GeForce 8800 GTS: 640 MB
GeForce 8600 GTS: 256 MB
GeForce 8600 GT: 256 MB

You left out the GeForce 8800 GTS 320 MB, perhaps the closest thing to a midrange card in this group. Could you please update the specs to include this card?

BTW, I'd really like to see you start including 3D rendering apps in your benchmarks. How a video card performs in Mojoworld or Lightwave or Maya can be just as important as how it performs in a game.
 


I'm certain both AMD and nV are thinking of refreshes, but we'll likely not see them until back to school or later. The fact that they tied each other in suckiness means no one is pushing them until maybe S3 enters the market in the fall, but I doubt S3's cards will be much better than the current offerings, and I'd suspect them to be competing against the HD 2400 and GF 8400.

I'm not sure if the GF8600GTS and HD2600XT will do well in Crysis. The early review with the GF8700M-GT (which sits between a GF8600GT and GF8600GTS in specs) seemed to run only at low resolution (800x600) with settings around medium (low depth of field, no AA, medium-ish textures), and they mentioned stuttering. I suspect the desktop versions won't be much better, and I doubt they'll run at 1280x1024 except on lower settings. Medium, I suspect, will be for 1024x768 or maybe less, but that's just a guess, and only the shipping product will give us the final verdict on these cards.
 
I'll throw in some speculation here (who, me?). What I'm seeing is a reluctance in many avenues because of DX10, Vista, PCIe 2, etc. Also, we have new contenders on the horizon in Larrabee (Intel) as well as S3. If there isn't DX10 content, and they're selling overpriced/underperforming cards, doesn't that leave a wedge/door open for, as Ape said, S3, and even for Larrabee? To me this is foolish marketing, letting the giant in the door. Both ATI/AMD and NVIDIA don't have as much time as they'd like to think, since the competition will be heating up next year. They'd better take care of their consumers now BEFORE this happens. My 2 shekels.
 


Whoops... I didn't realize that NVIDIA had already announced/created an 8700(M). Apparently it's a mobile (laptop) part. I don't really care what they call it (8700, 8800 GS, whatever), but they have a huge hole to fill in their desktop lineup between the 8600 GTS and 8800 GTS 320 MB. If they come out with the successor to the 8800 (at current 8800 pricing) and drop the 8800s down a tier in price, that would work too 😉.
 
Just FYI, the GF8700M-GT is actually between the GF8600GT and GTS in performance, so unfortunately it's not as good as we hoped (and WTF is the GF8800M going to be that it might draw only 22W?).

The card I think you're thinking of to fill the gap is the one that was originally touted as the GF8600 Ultra, then the GF8800GS, and is now in limbo for a name (it likely won't be either, since nV has denied the existence of both at some point). It's commonly known now as the 65nm GF8800GS, which is supposed to battle the also-rumoured HD2900 Pro (which is now rumoured to be 55nm). The main problem with just dropping the GTs into the segment is that it's still a 700-million-transistor part whose role could be filled by a ~450-500 million transistor part on a smaller process. Like the R9500 Pro, I think it makes sense for nV to move to something like that, barring an issue with the process (like leakage) that would compromise yields.

Sounds like Darren knows more about them than he can say. All I hope is that they arrive before Crysis, or some people may feel tempted to pull the trigger on a bad GF8600/HD2600 decision.
 

Well, it doesn't have video-in then. Dang, I think you knew what I meant. It's the EAX1950PRO from ASUS. Heck of a card, but no video-in. Google it.
"That aside, the EAX1950PRO features two HDCP capable dual-link DVI connectors, and video out capabilities (meaning no video input here)."
http://www.elitebastards.com/cms/in...sk=view&id=187&Itemid=27&limit=1&limitstart=1
 


Well, he did guess what you meant, but what you said was different. But whatever...

"That aside, the EAX1950PRO features two HDCP capable dual-link DVI connectors, and video out capabilities (meaning no video input here)."

Funny thing about that segment: from it you'd suspect that the X1950 was capable of dual-link HDCP support, while in fact it has dual-link support on each DVI output, and each of those separately supports single-link HDCP. The HD2K and GF8600/8500/8400/8300 (not GF8800) series added dual-link HDCP support.

So the X1K & GF7 cards with HDCP support can view protected content up to 1600x1200-1920x1200 (depending on blanking & refresh requirements), while the HD2K and newer GF8 series cards can go up to the dual-link blanking limit of around 2560x1600 @ 60 Hz.
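
To put rough numbers on those limits: the required TMDS pixel clock is just total pixels (active plus blanking) times refresh rate, and single-link DVI tops out at 165 MHz per link. Here's a minimal sketch using approximate CVT reduced-blanking totals (the blanking figures are ballpark, not exact timings):

[code]
# Why single-link DVI tops out near 1920x1200@60Hz while dual-link
# reaches 2560x1600@60Hz. Totals below are approximate CVT-RB timings.

SINGLE_LINK_MHZ = 165.0              # TMDS pixel-clock limit per DVI link
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ

# mode name -> (total width, total height) including blanking
modes = {
    "1920x1200@60": (2080, 1235),
    "2560x1600@60": (2720, 1646),
}

for name, (htotal, vtotal) in modes.items():
    clock_mhz = htotal * vtotal * 60 / 1e6   # pixels/second -> MHz
    link = ("single link" if clock_mhz <= SINGLE_LINK_MHZ
            else "dual link" if clock_mhz <= DUAL_LINK_MHZ
            else "beyond dual link")
    print(f"{name}: {clock_mhz:.1f} MHz -> {link}")
[/code]

That works out to roughly 154 MHz for 1920x1200@60 (just inside a single link) and about 269 MHz for 2560x1600@60 (dual-link territory), which matches the resolution ceilings above.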

Anywhoo, not a big deal, but still worth making sure people know the correct info for both issues.
 
Tom's butchered my link, but oh well, you get the idea. My mistake on saying AVIVO; truth is, I didn't know.

I naively thought VIVO was Video In Video Out... my bad.
 


From what I've read, as of today 3D rendering apps utilize the video card only to display the objects while modeling. When you're about to render/save what you've done, ONLY the processor is used. That's why the 3ds Max encoding test is included only in the CPU charts and not in the VGA charts. With CAD applications you get a pretty different picture: AutoCAD's own website doesn't recommend using consumer video cards, as they lack features that AutoCAD uses.
 
Until they get a true, low-power, mid-range card, and all of the bugs out of DX10, I am out. It seems what they are really trying to do is push everybody to consoles.

Shouldn't we have low-power, dual-core, 45nm GPUs by now? Hello, Intel, how about a discrete graphics card?
 
Darren Polkowski wrote: "As for those of you who are still in a holding pattern: you need to dive in soon." What a load of bollocks! Those of us in a holding pattern should wait and get DX10 hardware that can actually handle DX10 games written to take FULL advantage of DX10, and not WASTE money now! I am quite happy gaming in DX9 for the moment with my X1950 Pro, which holds its own at 1280x1024 (note: real-world resolution) with 4x AA in my fave games! So whilst I have a (free) copy of Vista Home Premium, I still game on XP, as Vista has some 3D issues that are yet to be resolved (such as having to disable one core to get 3D to work!).
 


We all know that 3ds Max and other 3D modeling software uses CPU power to render the final image. But viewport speed is really important too. From what I have seen, professional cards like the Quadro and FireGL don't accelerate 3ds Max that much, and I would like to know, as I am sure a LOT of people would like to know, how specific cards perform in 3D apps. There are a lot of people asking whether to buy a Quadro card or a GeForce for a specific app, and trust me, this is really a valid question.
 
Soon, as in when the next cards come out... that will be in 1-3 months. That is soon for most people saving for a graphics card.

I think DX9 cards are still where the price is best. The performance of these current "new" cards is less than stellar, but there are new cards coming soon.

5 more months and we are in a new year. Then we can say hello to Intel as they enter the graphics market.
 
Great article. My brother and cousin typically aim for the $100 video card since they play a lot more console games than PC games. I tend to play a few more PC titles, so I got myself a GeForce 7900GS a while ago, and this article fits our segment well. All of us are pretty satisfied with our video cards for the moment. Of course we would love having 8800GTS cards, but there are other aspects of our PCs that could use upgrading before the graphics cards.

I saw that the 8600 video card got released a little after I bought the 7900GS, and to me it's not worth it to get an 8600. Now, if NVidia makes an 8700 card that's well-priced, and the games on my old 1024x768 LCD monitor start slowing down significantly, I'll want it for sure. For now, I haven't run into that one game that I can't play at a decent setting.