DX10!

Well how much advantage did the R9700 have over the GF4/R8500?

GF6 over FX?

Seriously it's not just about the DX support but the raw horsepower of the new cards.

That's why I asked. I haven't seen any of the new cards, or any benchmarks. What have you seen?
 
Well the only DX10 card I have seen is not impressive, and I doubt it'd challenge a GF7600 or X1650 (maybe an X1600).

However, based on the specs we've got to chat about and the statements from ATi and nV, I have a feeling that we're in store for 2 nice boosts: one when the cards initially launch and simply give us more raw fill rate, etc., thanks to more components and faster speeds, and then a second boost once the DX10 benefits kick in.

Hey I could be wrong, but it's unlikely, since there'd be far less motivation to fork over money if an R600/G80 can't beat an aging GF7950GX2.
 
Hey I could be wrong, but it's unlikely, since there'd be far less motivation to fork over money if an R600/G80 can't beat an aging GF7950GX2.

Sure, my assumption is that the next gen will be a solid step up. But I've been a little let down a time or two in the past when expecting great things so I'm progressing in my research to get a PhD in cynicism.
 
Sure, my assumption is that the next gen will be a solid step up. But I've been a little let down a time or two in the past when expecting great things so I'm progressing in my research to get a PhD in cynicism.

Oh I agree; to me there have been a few letdowns, and I suspect that the R600/G80 will not be giant leaps in performance yet.

It's funny, I think we are let down by the comparison to our current expectations. The R9700P wasn't that impressive compared to the GF4Ti in weak titles at 800x600 or 1024x768, but as the resolutions and workloads went up, the R9700P shone with its greater power on both the core and the memory. Same thing with the X800/GF6800 and GF7800/X1800.

True DX10 titles (built for DX10, with no fallbacks) will likely run like crap on the new hardware, because by then they'll be pushing the envelope of some other new cards. But they should perform better than current cards in DX9; how much better, who knows. Probably around 25% if history keeps to its performance trend, but we could get some surprises, and I think the biggest unknown, and bonus, is how they will handle things once they can benefit from DX10 optimizations.

The G80 might be the next FX, but like the high-end FXs, people probably won't care until we are 2-3 years past launch, when the hybrid design might finally show a weakness. But even then I doubt it'll be the disaster the FX was, because it should do very well with current titles and not require tricks to perform better. I think no matter what, both will be solid performers in DX9 and should outperform anything we have to date.
 
I think no matter what, both will be solid performers in DX9 and should outperform anything we have to date.

I'll start the process of building patience. "I promise not to think about buying a DX-10 card till the prices drop 30%. I promise not to think about buying a DX-10 card till the prices drop 30%. I promise not to think about buying a DX-10 card till the prices drop 30%. I promise not to think about buying a DX-10 card till the prices drop 30%. I promise not to think about buying a DX-10 card till the prices drop 30%. I promise not to think about buying a DX-10 card till the prices drop 30%."
 
I would never promise such a thing, because if it's cracker-jack good, why wait if it's in your sweet spot?

Of course for me it'd have to be a mobile solution we're talking about.

I bought the R9600P the first week because it perfectly fit my needs at the time, same with pretty much everything I buy that doesn't require financing.
 
I would never promise such a thing, because if it's cracker-jack good, why wait if it's in your sweet spot?

My promise was made a bit as a joke to show my wife. I put together three AMD rigs in the last year, one of them with XFired 1900s, another with a single 1900 and the last with an 1800. Along the way, our large male Golden vaporized my laptop and I had to replace it. I like to stagger PC purchases but the needs dictated the purchases. So I'm already queuing up to replace the home office PC with a Conroe, and part of the spousal negotiation process has involved commitments on my part regarding GPUs... And that's fine - I don't upgrade often and do not chase the bleeding edge. When I do upgrade, I try to get very good performance per dollar, and thus I'm planning to be patient on the Conroe rig because I hope to find a decent deal on mobo/CPU/RAM. I don't really expect that to happen for something like 4 to 6 months - maybe more. I probably won't find the time to fully tweak the three newish AMD rigs by then anyway, as I am an ultra-patient overclocker.

I bought the R9600P the first week because it perfectly fit my needs at the time, same with pretty much everything I buy that doesn't require financing.

For sure. I've done the same thing. My current digital camera had barely settled onto the shelf when I nabbed it and it has turned out to be super. I'm at almost 10K images and it's barely broken in - not that the number of images is the key quality issue but it does show that I like it.
 
It boils down to this... when there are killer DX10 games available, we'll all buy DX10 hardware. Until then, these threads need to die...

kthx

That's all anyone really needs to know on this subject.

The mods should just delete all these threads until DX10 is even out!
 
My promise was made a bit as a joke to show my wife.

Yeah, and I did get that; mine was more of an 'oh no no no, can't promise such a thing for my precious'. But reading it now, it does take a more serious tone without smileys. That's what happens when you write quickly at work. 😳

For sure. I've done the same thing. My current digital camera had barely settled onto the shelf when I nabbed it and it has turned out to be super. I'm at almost 10K images and it's barely broken in - not that the number of images is the key quality issue but it does show that I like it.

Yep, I know the feeling. And who says taking a lot of pics isn't the key to quality; it's like the infinite number of monkeys using an infinite number of cameras will eventually take the best picture ever? :mrgreen:

OK, now that's funnier. Anywhoo, yeah, I know the feeling. I'm currently debating what to get next for my DigSLR, and I'm really having trouble because I have free access to a Kodak-14n, so my motivation level is low. But I have a feeling I'd take 100s of pictures the first weekend I finally do buy, just to test all the new features and compare to what I'm now used to, etc. It was like OCing the R9600P just for the fun of 'I wonder what it can do?'.

Anywhoo, the main thing is that people find their comfort zone, because we can say an X1900XT/GF7900GT is the best buy, but perhaps for their level of commitment or budget they really should be getting an X1800GTO/GF7600GT, or maybe waiting for DX10 is pointless for them, since the mid-range might not fully ship until next spring (although there have been rumours of full lineups from both within 2 months of launch, I wouldn't put money on that).
 
And who says taking a lot of pics isn't the key to quality; it's like the infinite number of moneys using an infinite number of cameras will eventually take the best picture ever? :mrgreen:

Yea, practice may not make perfect in my case but at least I get to eat a bunch of bananas!
 
Dang, I didn't notice until just now I misspelled MONKEY! 😳

BTW, the way I look at it, if I take 3 pictures of the same thing, maybe one of them will be a keeper, and thanks to digital, I can erase the others. 8)

Still, when my girlfriend took 10 pictures of the same rooster with a FILM camera on the first day of our vacation in France, I did say WTF?!? when I got them back from the lab. 8O At first I thought we got double prints. :lol:
 
However, the benefits in Crysis may be enough to make the DX10 bonuses worth it. Not required, but definitely enjoyed. I think Halo 2 may be the only DX10-only app (by design) until about 2008. Even UT2K7 will supposedly have a DX9 fp24 fallback, so not even just DX9.0c, but about the same level of fallback as Oblivion (although I suspect it might play worse on an R9600 or X700 than Oblivion does).

Use and need are different animals; like I mentioned, UT2K7 and Crysis will have a use for it, just not a 'NEED'.

I agree; if anything, they'd move to those wall-socket solutions we've seen from ASUS.

Well, I did actually pause and think on that word quite a bit; originally, I was going to put "need," but then I realized that to date, 99.999% of games only "need" a DirectX 8.0 card; Oblivion's the only major title I know of that even needs a DirectX 9 card!

I wasn't aware of Crysis, though; I'm not sure what use it might have for SM 4.0.

Oh, and I think those "wall socket" solutions are perhaps a stupid idea. Yes, it circumvents the need for a beefier power supply, but those things are notoriously unstable and fragile. I have enough cords running around my desk already...

Won't DX10 cards come out before Vista??

I personally have no clue. I'm not even sure if R600 and G80 will be actual DirectX 10 hardware. Because ATi's still got the R580+ (X1950) and RV575 (X1700) in the wings, I'll guess they're holding off on R600 for now, and that nVidia might be as well, what with all their new focus on "quad-SLi."

That's why I asked. I haven't seen any of the new cards, or any benchmarks. What have you seen?

All I've seen are rumors (albeit mildly strong ones) and nothing more. As you might've heard, R600 supposedly does move to a unified shader architecture, and will have perhaps 64 shader units; each has 2 ALUs, so it can act as either a single pixel shader or as a pair of vertex shaders (two units to process a vertex in one clock cycle). This will provide perhaps only a modest gain over the X1900 series in terms of pixel power, but a potential bonanza when it comes to increasing vertex power. Given that the R580+ uses GDDR4, the R600 will likely use the same. I'm not even willing to guess at anything else on that chip.
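
If the rumor holds, the flexibility is easy to picture with a quick toy calculation. The 64-unit/2-ALU figures are just the rumored ones above; the per-clock scheduling is my own made-up simplification, nothing ATi has described:

    # Toy model of the rumored unified pool: 64 units, 2 ALUs each.
    # Each unit works per clock as one pixel shader or as two vertex ALUs.
    # Illustrative only -- not ATi's actual scheduler.
    UNITS = 64

    def ops_per_clock(pixel_fraction):
        pixel_units = int(round(UNITS * pixel_fraction))
        vertex_units = UNITS - pixel_units
        return pixel_units, vertex_units * 2  # 2 ALUs => 2 vertex ops

    for mix in (1.0, 0.75, 0.5):
        px, vx = ops_per_clock(mix)
        print("%3d%% pixel work: %2d pixel + %3d vertex ops/clock" % (mix * 100, px, vx))

Even at a 75/25 split you'd match the X1900's 48 pixel shaders while roughly quadrupling its 8 vertex units, which is where the "bonanza" would come from.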

As for the G80, the prevailing opinion seems to think it will be "32-pipeline." I'm not sure if they're heading for a unified architecture as well, though.

I'm fairly certain you've already heard the above; frankly, I can't find much out about anything here.

Here's a hint, and a scavenger hunt.

Start @ HKEPC then look for the 3rd thing you see about DX10. :twisted:

Some people do indeed forget that there are companies other than ATi and nVidia...
 
OK, everybody is going on about DX10; is it really such a big thing??

I'm looking at maybe updating my PC when Conroe is available, which I believe is very soon, and with it my GC.

I was looking at buying a 7900GT 256MB, but I'm not sure, the way everybody is going on about DX10.

1. When is DX10 supposed to be coming out?
2. Would a 7900GT card not be able to play DX10 stuff with a BIOS/driver update? Surely it must.
3. When should I update my rig? Not bothered if it's 2, 6 or 9 months.

1. The DX10 API is set for some time next spring, but DX10 GPUs are coming next month. Intel is set for an integrated release and an in-slot card next month; nVidia will release G80 with SM4.0 in about 2 months; ATI will release its 80nm GPUs in November and its 65nm GPUs in December.

2. Yes, but you're talking about emulation, which will cut performance down quite a bit. The 7900GT will only be about half as fast as the top DX10 GPUs, so if you emulate up to DX10 you're going to see performance only a little higher than an entry-level DX10 GPU.

3. I would wait, as waiting never hurts and the options only get higher performance as time goes on, but that's a question of whether your current system will do until then.
 
Personally, I am NOT going to be buying a DX10 card because of its ridiculous wattage requirements; like, you will need a 1000W PSU or a dedicated GFX PSU for a DX10 card. So I say:

Buy a nice 4MB-cache Core 2 Duo CPU, with 2GB of DDR2-800 RAM, and an nVidia 7900GT or an ATi Radeon X1900GT or X1900XT, depending on how much GFX power you want.

P.S. EVGA and XFX make good nVidia cards.

I have read plenty of articles about this and I agree with you. This situation will be like Intel's. Intel's Prescott processors required a large amount of power. As technology developed, Intel's Core 2 Duo (Conroe) came out, which improved performance while greatly lowering power requirements. Why couldn't the same happen for graphics cards?
 
I wasn't aware of Crysis, though; I'm not sure what use it might have for SM 4.0.

Well, the true use I'm not sure of, other than some great performance benefits: memory paging to allow for much larger textures, pre-caching, greater use of efficient geometry modeling (which should help foliage the way geometric instancing did, except likely better, thus lush greenery without the hit of Oblivion), talk of soft self-shadowing, soft particles with diffused lighting (making jacob's-ladder light effects through foliage/clouds/water/smoke more realistic), and I would presume greater use of true parallax mapping, finally. The addition of Direct Physics will likely add to DX10, but have nothing to do with SM4.0 of course; Crysis being the VPU-physics darling, UT2K7 the PPU's.
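
To put a rough number on the foliage point, here's a crude little cost model; every figure in it is invented purely for illustration, not a measured spec:

    # Why instancing helps foliage: one draw call submits N copies
    # instead of N separate calls. All costs below are made up.
    DRAW_CALL_US = 30.0   # pretend CPU overhead per draw call (microseconds)
    PER_BUSH_US = 2.0     # pretend GPU cost per bush (microseconds)

    def overhead_ms(bushes, instanced):
        calls = 1 if instanced else bushes
        return (calls * DRAW_CALL_US + bushes * PER_BUSH_US) / 1000.0

    n = 5000  # an Oblivion-style forest scene
    print("one call per bush: %.1f ms" % overhead_ms(n, False))
    print("instanced:         %.1f ms" % overhead_ms(n, True))

The numbers are fake, but the shape is real: per-call overhead dwarfs everything once the bush count climbs, which is why instancing made lush greenery feasible in the first place.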

Oh, and I think those "wall socket" solutions are perhaps a stupid idea. Yes, it circumvents the need for a beefier power supply, but those things are notoriously unstable and fragile. I have enough cords running around my desk already...

True. I just think that if people are anal about the whole PSU-quality and amperage-across-the-rails thing, this ensures that the cards get the power ATi and nV expect, although as you say they can be dodgy, and of course get a bit hot.

As you might've heard, R600 supposedly does move to a unified shader architecture, and will have perhaps 64 shader units; each has 2 ALUs, so it can act as either a single pixel shader or as a pair of vertex shaders (two units to process a vertex in one clock cycle). This will provide perhaps only a modest gain over the X1900 series in terms of pixel power, but a potential bonanza when it comes to increasing vertex power.

Yeah, and that's the thing: it's still only 16 ROPs on top of the increased pixel shader units. Even with the current dedication of vertex units acting on geometry in equal number, you still have the potential for 8 extra pixel shader units, and at an increased clock speed as well, so there's a lot of potential there. In situations that heavily load either pixel or vertex we probably won't see a major increase in average or max framerates, but we'll likely see a jump in the minimum fps, and in how often we experience crushing dips.
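
Just to illustrate that minimum-fps point, here's a quick toy simulation; the 48/16 split and the workload ranges are completely made up, purely to show the load-balancing effect:

    # Fixed split vs. unified pool under a varying pixel/vertex mix.
    # All numbers are invented for illustration; times are arbitrary units.
    import random

    UNITS = 64
    FIXED_PS, FIXED_VS = 48, 16   # assumed fixed partition, not real specs

    def fixed_time(px_work, vx_work):
        # frame time is set by the more overloaded partition
        return max(px_work / FIXED_PS, vx_work / FIXED_VS)

    def unified_time(px_work, vx_work):
        # a unified pool can pick the best split for each frame's mix
        return min(max(px_work / p, vx_work / (UNITS - p))
                   for p in range(1, UNITS))

    random.seed(1)
    frames = [(random.uniform(100, 900), random.uniform(20, 400))
              for _ in range(1000)]
    print("fixed   worst frame: %.1f" % max(fixed_time(p, v) for p, v in frames))
    print("unified worst frame: %.1f" % max(unified_time(p, v) for p, v in frames))

Same ballpark most of the time, but the ugly spikes get clipped, and that's exactly where you feel it.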

As for the G80, the prevailing opinion seems to think it will be "32-pipeline." I'm not sure if they're heading for a unified architecture as well, though.

I would think not; a hybrid makes sense from their long-term perspective. IMO, expect them to do what they did with the GF6600GT, following ATi's lead somewhat, and introduce the changes in their mid-level product; so likely the 1st refresh after the launch of the mid-range will be a unified design, if not that card itself (unlikely and riskier, but it could happen). The hybrid will allow for a fixed pixel layout but unified geometry and vertex, which would be a wise choice if they aren't going full-out unified.

Here's a hint, and a scavenger hunt.
Some people do indeed forget that there are companies other than ATi and nVidia...

LOL!

Exactly; heck, some people even forget Intel, and probably the first one to market will be the biggest, but it'll also likely perform worse than all the others for that generation. Still, I wouldn't be surprised if we see the GMA965 outperforming the early GF7300s (before the GT refresh) and X1300HMs.
 
Yeah, but it's nice to see something, instead of thinking they were just going to fold up and go away.

BTW, they do video playback pretty well, so they're likely targeting the HTPC market, and I wouldn't be surprised if they do quite well once they learn how to market their wares.
 
EDIT: Proofreading my own post led me to one conclusion: I have poor written expression right now, and I shall retire because some much-needed sleep is in order. I've been extremely busy for the past couple of days... and I missed you guys :lol: :lol: :lol: