Fill me in on DX10 cards

That's not a bad idea. People love those oversized coolers. The only reason I prefer them is that they exhaust hot air out of the case, rather than using the hot air inside and/or creating more hot air inside the case like most other coolers. Plus, even though the GPU might not need it, having a beefy cooler is never a bad thing. 8)

Yep, the Arctic Cooling HSFs are great (sure hope ATi's next cooler doesn't lose its good points). And really, it does make people go "Ooohh, kick-ass!" Of course they're on X1300s too, so it means nothing. But your reason for liking them is one of mine too: all the hot air leaves the case, and that's a good thing because everything else in there is already warm enough. And actually it will let you OC the core a good amount; too bad it's the memory that's really holding it back. But at that price there's nothing close, and when you resell it, it has everything a checkbox eBayer is looking for: BIG HSF, SM3.0+, HDR+AA, latest tech. Then throw in dual-link DVI; heck, even a photochopper could use it for 2D on a 30" LCD.

Yeah as a buy and resell, I'd say it's a good option for the price. If you were holding on to it for a year or two I'd direct you elsewhere but it seems perfect for your needs.
 
Currently, if you use an X1900 Crossfire card and an X1600XT, you'd be making a mistake as the pixel pipelines on the better card default to those of the weaker card.

You'd be mistaken, because that's not the case. They would both work independently in their own pixel pipeline configuration, but not in Crossfire, which is no different than anyone else's solution. The X1600 would not be used for rendering graphics, except in a multi-monitor capacity.

I should have referenced the actual ATI Crossfire FAQ. This is what it says:

6. What happens when you pair a 12-pipe CrossFire Edition card with a 16-pipe card?

A. In this scenario both cards will operate as 12-pipe cards while in CrossFire mode.

http://www.ati.com/technology/crossfire/faq.html

The X1800XL or XT would have been a better example than the X1900XT. The X1900XT has 16 pixel pipelines, but does more shader operations, whereas the X1800 is closer in design to the X1600. Anyways, the above ATI Crossfire information is what I was attempting to explain.

Thanks for information on the current 2 PCIe x16 and 2 PCIe x8 board. I can't wait till the 3 PCIe Crossfire boards arrive with next year's onboard ATI DX10 graphics. I've always felt that onboard graphics X200 and above should work for more than just a second monitor, that it should work as a poor man's Crossfire alongside a single card in a single PCIe board, or that it should work for physics.

As for the Crossfire Masters, I was referring to the driver issue of having a DX10 Crossfire Master with a DX10 card for the second Crossfire, and an older Crossfire card as the physics in the third setup.

ATI's first Conroe Crossfire boards will be 2 PCIe x8 instead of x16, but they'll have the onboard X700 based graphics.
 
yipsl said:
I should have referenced the actual ATI Crossfire FAQ. This is what it says:

6. What happens when you pair a 12-pipe CrossFire Edition card with a 16-pipe card?

A. In this scenario both cards will operate as 12-pipe cards while in CrossFire mode.

Actually you should read the context of the question: that's the X800 Pro with an X800XT, not the X1600 with the X1800/1900.

The X1800XL or XT would have been a better example than the X1900XT.

Neither way works for your explanation. Understand that the X1600 is 4 'pipelines', not 12; this is the first reason you're going to have difficulty understanding this. NO X1600 will Crossfire with either the X1800 or the X1900 in the context you're describing.

The X1900XT has 16 pixel pipelines, but does more shader operations, whereas the X1800 is closer in design to the X1600.

You got that backwards, the X1600 is closer to the X1900 design than the X1800 design.

Anyways, the above ATI Crossfire information is what I was attempting to explain.

I know what you were attempting, but you're confused. Your first false assumption was the previous line; the second is that ATi is talking about something other than crossfiring two cards of the same series (i.e. X800 with X800; X1300 with X1300).

As for the Crossfire Masters, I was referring to the driver issue of having a DX10 Crossfire Master with a DX10 card for the second Crossfire, and an older Crossfire card as the physics in the third setup.

Well, there won't be older Crossfire cards at that time; it's either DX9 or DX10, and really, like I said before, it has to be the same series, so the same DX. As for 2 DX10 cards in Xfire and 1 DX9 card for physics, that'll be fine because it's not rendering graphics, it becomes a physics unit. And as for the multi-monitor option, it will act like the DX9 cards do now with DX7 or older cards, where the game rendering still sticks to DX10 and the other card gets DX9L support for basic multi-monitor output.

ATI's first Conroe Crossfire boards will be 2 PCIe x8 instead of x16, but they'll have the onboard X700 based graphics.

Which matches the above example, DX10+DX9.
 
Damn... now I'm wondering if I should just go ahead with the card I was planning to get:
http://www.newegg.com/Product/Product.asp?item=N82E16814102008.

Or get a lower midrange card like this one:
http://www.newegg.com/Product/Product.asp?item=N82E16814161161

:?

Any 7900GT, or whatever the ATI equivalent is, is at the top range of what the human eye can really see.

After that point, all the tests, reports, reviews, benchmarks and other propaganda are just there to keep the writers busy and people spending money.
 
Any 7900GT, or whatever the ATI equivalent is, is at the top range of what the human eye can really see
Um, that all depends on the game. Oblivion hammers a 7900GT at just 1024x768 resolution, with the 7900GT averaging only 24fps in this test. We could all see a difference between 10x7 and 16x12 resolution, and can clearly notice when framerates are in the teens.
 
I don't plan to run Oblivion anytime soon. I'll try to get any GPU-demanding game on consoles whenever possible, unless it's an FPS.

This system is mainly for FPSs and some PC-only titles I've missed out on for the past... 6 years. :lol: So if it can handle FEAR at a widescreen resolution, then I'll be happy.

Pauldh, how's that X1800XT working out for you? That's the card I'm looking to get. :)
 
Actually you should read the context of the question: that's the X800 Pro with an X800XT, not the X1600 with the X1800/1900.

What page are you reading? Here are the relevant sections of the Crossfire FAQ. The info about a 16-pipeline card defaulting to 12 pipelines when used with a 12-pipeline card seems to apply across the board, not just to the first implementation of Crossfire.

They both operate at their respective clock speeds, but the card with more pixel pipelines seems to lose out. They list all the Crossfire ready cards on this table, not just the older series.

1. What is the difference between ATI’s CrossFire platform and the competitor’s solution?

A. The principal differences between the competitor’s multi-GPU solutions and ATI’s CrossFire are:

a. CrossFire can enable multi-GPU rendering on all applications.
b. CrossFire supports more rendering modes. Supertiling evenly distributes the workload between the two GPUs to improve performance. CrossFire can use multiple GPUs to improve image quality rather than performance with Super antialiasing (AA) modes. Supertiling and SuperAA modes are only supported on the CrossFire platform.
c. CrossFire is an open platform that supports multiple components and graphics cards that can be mixed and matched in a single system. Competitive multi-GPU solutions are constrained to supporting identical graphics cards.

2. What graphics cards work with CrossFire?

A. A complete CrossFire system requires a CrossFire Ready motherboard, a CrossFire Edition graphics card and a compatible standard Radeon (CrossFire Ready) graphics card from the same series, or two CrossFire Ready cards if they are software enabled. This applies to cards from ATI or any of its partners.
Card One                                               Card Two
Radeon X1900 Series                                    Radeon X1900 CrossFire Edition
Radeon X1800 Series                                    Radeon X1800 CrossFire Edition
Radeon X1600 Series                                    Radeon X1600 Series
Radeon X1300 Series                                    Radeon X1300 Series
Radeon X850 Series                                     Radeon X850 CrossFire Edition
Radeon X800, PRO, XL, GTO, XT or XT Platinum Edition   Radeon X800 CrossFire Edition

3. What is the difference between CrossFire Ready graphics cards and CrossFire Edition graphics cards?

A. CrossFire Edition graphics cards include a “compositing engine” chip on-board. This chip takes the partially rendered image from the CrossFire Ready graphics card, and merges it with the partially rendered image from the CrossFire Edition graphics card. The result is a complete frame rendered at up to twice the performance of a single graphics card. The CrossFire compositing engine is a programmable chip that offers flexible support of different graphics cards, allows a superior feature set (advanced compositing modes), and enables further enhancements to be quickly implemented on next generation products. The CrossFire compositing engine also offers a performance benefit over combining the final image on the GPU.

4. What motherboard is required for a CrossFire system?

A. CrossFire Xpress 3200, CrossFire Xpress 1600, Radeon Xpress 200 CrossFire, Intel i955X and Intel i975X based dual-slot motherboards are supported platforms.

5. When will CrossFire Ready motherboards be available?

A. CrossFire Xpress 3200, CrossFire Xpress 1600, Radeon Xpress 200 CrossFire, Intel i955X and Intel i975X motherboards are available from our partners now.

6. What happens when you pair a 12-pipe CrossFire Edition card with a 16-pipe card?

A. In this scenario both cards will operate as 12-pipe cards while in CrossFire mode.

7. What happens when your CrossFire Edition card and a compatible standard Radeon (CrossFire Ready) graphics card have different clock speeds?

A. Both cards will continue to operate at their individual clock speeds.
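
Just so the Q6/Q7 behaviour is concrete, here's a rough sketch of how I read those two answers (my own Python illustration with example numbers, not anything from ATI's drivers):

# Q6: both cards drop to the lower pipeline count while in CrossFire mode.
# Q7: each card keeps its own clock speed.
# Pipe counts and clocks below are example figures, not official specs.
def crossfire_mode(pipes_a, clock_a_mhz, pipes_b, clock_b_mhz):
    return {"pipes": min(pipes_a, pipes_b),            # e.g. 12 when pairing 12 + 16
            "clocks_mhz": (clock_a_mhz, clock_b_mhz)}  # clocks stay independent

# a 12-pipe CrossFire Edition card paired with a 16-pipe card
print(crossfire_mode(12, 500, 16, 520))
# {'pipes': 12, 'clocks_mhz': (500, 520)}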


The question I have is what happens with the X1900 Crossfire card running with an X1600XT? The X1900 has 16 pipelines that do 3 shader operations each, so does it default to 12 with 3 operations each?

The X1800 is based on the R520 core, the X1600 is based on the R520 core and the X1900 is based on the R580, so I stand by my comment that the X1600 is closer in generation and operation to the X1800 than the X1900.

Regardless of what core generation the X1600 is derived from, it only has 12 pixel pipelines and one operation per pipeline. Thus, it does shaders in a manner closer to the X1800 than the X1900.

Note the Newegg description of the ATI X1900XT:

# ATI 102-A52025 Radeon X1900XT 512MB GDDR3 PCI Express x16 Video Card - OEM

Chipset Manufacturer: ATI
Core clock: 625MHz
DirectX: DirectX 9
DVI: 2
Memory Clock: 1450MHz
Memory Interface: 256-bit
OpenGL: OpenGL 2.0
PixelPipelines: 16(48 pixel shader processors)
TV-Out: HDTV/S-Video/Composite Out
VIVO: Yes
# Model #: 102-A52025
# Item #: N82E16814102690

So, I think it's a valid question: an X1900GT with 12 (36) pipelines and an X1900 Crossfire card with 16 (48)?

ATI also didn't make the All in Wonder cards Crossfire ready, which is why I can't use an AIW X1900 with an X1900 Crossfire card. That would be an ideal situation for the way I use my PC. Perhaps that will change with the DX10 All in Wonder cards.

Neither way works for your explanation. Understand that the X1600 is 4 'pipelines', not 12; this is the first reason you're going to have difficulty understanding this. NO X1600 will Crossfire with either the X1800 or the X1900 in the context you're describing.

The X1300 is 4 pipelines. You have it wrong. The X1600 is 12 pipelines. Note the relevant info from Newegg:

# HIS Hightech H160XTQT256GDDN Radeon X1600XT 256MB GDDR3 PCI Express x16 IceQ Turbo CrossFire Supported Video Card - Retail

Chipset Manufacturer: ATI
Core clock: 600MHz
DirectX: DirectX 9
DVI: 2
Memory Clock: 1400MHz
Memory Interface: 128-bit
OpenGL: OpenGL 2.0
PixelPipelines: 12
TV-Out: HDTV/S-Video Out
VIVO: No
# Model #: H160XTQT256GDDN
# Item #: N82E16814161160

I can't believe you're getting such basic information wrong. There might be a difference between newer drivers and mixing different generations of Crossfire cards not mentioned in their FAQ, but at least don't say that the FAQ info on defaulting to x number of pipelines only applies to one series, when the FAQ as a whole references all the series. At least get the number of pipelines of the X1600XT series correct.

I've owned both ATI and Nvidia cards over the years (and Diamond Voodoo cards before that) and I try to get the information related to each company's cards correct. I've been mainly an ATI fan since I started using All in Wonder cards in each of my PCs. That's an area where Nvidia just can't compete.
 
I'm a long term Elder Scrolls fan since TES: Arena and IMHO, Oblivion is a FPS. It's closer to a Battlespire 2 than it is the next Daggerfall. I think it will compare favorably to the "Half Life Too" that Ubisoft is making out of the Might and Magic series.

Other people have been arguing that Oblivion isn't a true RPG and isn't a true FPS, so even if it's a chimera in design, it's at least a marketing success. No matter what anyone says about today's Bethsoft, they know how to market.
 
I can't tell; everybody has a different point of view on this.

All I can tell is that I'll probably get a Radeon X1800XT for around $350 (Canadian) when I get my new computer in September, unless DX10 cards are already out. This way I'll have good performance without paying too much. If I got a $600+ VPU, it would be too much for a card that I'd use maybe only 9-12 months before upgrading to DX10.

If I had more patience, I would wait until December to get my computer with a DX10 card. I might actually do that, I don't know yet. It's just that at some point you have to buy, or you'll wait forever. I just don't know when the best time is.

That's what you should ask yourself.

My 2 cents :wink:
 
I tried to hold back my criticism before, but seriously dude, you don't know $H1T, and now you're confusing people who don't know that you don't know squat, as well as wasting my time correcting you, thus Pi$$ing me off in the process!



What page are you reading? Here are the relevant sections of the Crossfire FAQ. The info about a 16-pipeline card defaulting to 12 pipelines when used with a 12-pipeline card seems to apply across the board, not just to the first implementation of Crossfire.

The section you're missing is the first one:
a CrossFire Edition graphics card and a compatible standard Radeon (CrossFire Ready) graphics card from the same series, or two CrossFire Ready cards if they are software enabled.

i.e. X1600 with X1600, X1800 with X1800, X1900 with X1900.
If you can't wrap your head around what that means, there's nothing left to discuss until you do more, and more importantly BETTER, research. Stop spouting your misconceptions; you're just F'in up people who know as little as you!


The X1300 is 4 pipelines. You have it wrong. The X1600 is 12 pipelines. Note the relevant info from Newegg:

Whatever dude, you're an idjit if you take your information from Newegg; heck, they list AGP cards as Crossfire ready! :roll:

I can't believe you're getting such basic information wrong.

I can't believe that you still bother posting your ignorance. I'm right on this and you're wrong.

And if you continue like this then you better get used to it. Now STFU before I beat you with my massive brain!

There might be a difference between newer drivers and mixing different generations of Crossfire cards not mentioned in their FAQ, but at least don't say that the FAQ info on defaulting to x number of pipelines only applies to one series, when the FAQ as a whole references all the series. At least get the number of pipelines of the X1600XT series correct.

Seriously, there's no point in discussing it with you, because you wouldn't know the difference between a pixel pipe and a crack pipe, although you sound like you're more familiar with the latter.

I've owned both ATI and Nvidia cards over the years (and Diamond Voodoo cards before that) and I try to get the information related to each company's cards correct.

Just because you own it doesn't mean you know anything about it. Just like your average DELL owner.
 
Sure some features are really nice (like using a USB flash drive as system memory!)

Why would you want to use slow memory as system RAM?
If you meant BOOT from it (USB), you can do that already.

http://tkc-community.net/Forums/viewtopic.php?t=4206

Vista will be a hog. As it is, if you have 2GB of RAM then it wants to use up to 940MB and load the whole thing into the system.
Looks like a good idea... make the OS fast by having it all in RAM... until you need that RAM.

It has good points and bad points, but the main thing I dislike is the DRM locked into its core.

Yes, that guy knew what he was talking about... a flash memory disk should have considerably lower access times than a conventional hard drive... I believe Vista plans on using that sort of alternate storage to complement RAM... not replace it. I think M$ envisions it being used for the temp/swap file.
 
From what I read it's primarily the notebook/laptop segment that is forced to do this. It's for the auto archiving, etc.

The flash ram has quicker access (thanks to no spin-up time) but slower transfer rates.

It'll be nice, but once they get MRAM on the drives then it should be both quick and efficient, eventually likely replacing HDs altogether IMO, but that's a ways off.
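
Back-of-the-envelope, the access-time vs transfer-rate tradeoff looks something like this (ballpark numbers I made up purely for illustration, not measurements of any real drive):

# Toy model: total read time = access latency + size / transfer rate.
def read_time_ms(size_kb, latency_ms, transfer_mb_s):
    return latency_ms + (size_kb / 1024.0) / transfer_mb_s * 1000.0

hdd   = {"latency_ms": 12.0, "transfer_mb_s": 60.0}  # slow to get started, fast streaming
flash = {"latency_ms": 0.5,  "transfer_mb_s": 25.0}  # near-instant access, slower streaming

for size_kb in (4, 64, 4096):
    print(size_kb, "KB:",
          round(read_time_ms(size_kb, **hdd), 2), "ms HDD vs",
          round(read_time_ms(size_kb, **flash), 2), "ms flash")
# Small random reads (swap/prefetch) favour flash; big sequential reads favour the HDD.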
 
Stop being a troll. I used the Newegg data for quick reference. If you want Anandtech, then here it is:

ATI's RV530 aka Radeon X1600

Targeting the upper midrange is the Radeon X1600 (RV530), built on the same 90nm process as the Radeon X1800 (R520). All of these RV530 series and lower are single-slot products, according to the roadmaps.

ATI RV530 Roadmap
Card        Pipes   Std Core Clock   Std Memory   Power Consumption
X1600 XT    12      600MHz           700MHz       60W
X1600 Pro   12      500MHz           400MHz       40W


Neoseeker

Lastly, direct from ATI

GPU Specifications
Features
157 million transistors on 90nm fabrication process
Dual-link DVI
Twelve pixel shader processors
Five vertex shader processors
128-bit 4-channel DDR/DDR2/GDDR3 memory interface
Native PCI Express x16 bus interface
AGP 8x configurations also supported with AGP-PCI-E external bridge chip
Dynamic Voltage Control

Have fun trolling somebody else, but anyone who yells, insults and swears all the while claiming that an X1600XT has 4 pixel pipelines is either a fool, a deliberate troll, or a fanboy of a competing GPU company who just can't get his facts straight out of emotional need.
 
Stop being a troll.

LOL!
Troll would imply I'm wrong. So far you're the only one posting incorrect information, and continuing to do so despite sinking in your own mistakes.

Why am I not surprised you're an Anandtech flunkie. :roll:

Twelve pixel shader processors

Which is not a pipeline. Why do you think they expressly said 'processor' and not 'pipeline'?
In order to count as a pipeline it must have a texture component, of which it only has 4; therefore 12 shader processors, and 4 pipelines. Like I said before.

http://www.tomshardware.com/2005/10/06/ati_enters_the_x1000_promised_land/page8.html

"Radeon X1000 architecture de-couples components of the rendering pipeline."

Only 4 full pixels can be processed per clock; each pipeline contains 3 pixel shader units and one texture unit.

The X1600 has 12 shader processors and 4 texture units, therefore 4 full pipelines. The X1800 has both 16 shader units and 16 texture units, and as such 16 full pipelines, and the X1900 has 48 shader units and 16 texture units, and as such still just 16 pipelines.
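
If it helps anyone following along, the counting works out like this (a quick sketch of my own, using the numbers above, nothing official):

# Pipeline counting per the decoupled X1000 design described above.
# Numbers are the ones quoted in this thread, typed in by me for illustration.
cards = {
    "X1600": {"shader_units": 12, "texture_units": 4},
    "X1800": {"shader_units": 16, "texture_units": 16},
    "X1900": {"shader_units": 48, "texture_units": 16},
}

for name, c in cards.items():
    pipelines = c["texture_units"]                     # a full pipeline needs a texture unit
    shaders_per_pipe = c["shader_units"] // pipelines  # shader ALUs stacked on each pipeline
    print(name, pipelines, "pipelines,", shaders_per_pipe, "shader unit(s) per pipeline")

# X1600: 4 pipelines, 3 shader units each (the 4-1-3-2 config)
# X1800: 16 pipelines, 1 shader unit each
# X1900: 16 pipelines, 3 shader units each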

Have fun trolling somebody else, but anyone who yells, insults and swears all the while claiming that an X1600XT has 4 pixel pipelines is either a fool, a deliberate troll, or a fanboy of a competing GPU company who just can't get his facts straight out of emotional need.

Anyone who doesn't know the difference between a pipeline and a single pixel shader is an ignorant nÖÖb, especially if they think they can Crossfire an X1600 with an X1800 or X1900, while posting a FAQ proving themselves wrong.

It's quite lame that you try to use your mistaken understanding of the difference between a single shader unit and a pipeline to try and defend your original mistakes. So far you've only posted a list of sites that made similar mistakes.

Only later did the reviewers finally figure out what the new design was:

"It wasn't until RV530 (Radeon X1600) was fully understood did these descriptions become clearer as to their intent; with a configuration signalled on the roadmap as 4-1-3-2 for RV530 it become clear that the numbers represented the number of "pipelines" (even though ATI's engineering didn't like this terminology), the number of texture units per pipeline, parallel shader ALUs per pipeline and the number of Z samples per pipeline."

http://www.beyond3d.com/reviews/ati/r580/

Read the first two pages of that review and edjumakate yourself, or else go crawl back to Anandtech where you belong!



edited typo.
 
anyone who yells, insults and swears all the while claiming that an X1600XT has 4 pixel pipelines is either a fool, a deliberate troll, or a fanboy of a competing GPU company who just can't get his facts straight out of emotional need.

He was actually right. It has 12 pixel shaders, but it can only render four pixels per clock. That puts it at 4 pipelines. Do a search on Radeon X1600 reviews and you'll find good explanations of the card's specifications. Because of this, it won't perform so badly in today's games and upcoming ones, since those games run multiple shader calculations for every single pixel rendered on screen (meaning they are more complex). But in older games it will lag behind, because 8 of the 12 pixel shader units will just have nothing to do.
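
A crude way to picture that (my own toy numbers, not benchmark data): output per clock is capped both by the 4 texture units and by how much shader math each pixel needs, so older games leave most of the 12 shader units idle.

# Toy model of why a 4-pipe / 12-shader-unit design favours shader-heavy games.
def pixels_per_clock(texture_units, shader_units, shader_ops_per_pixel):
    return min(texture_units, shader_units / shader_ops_per_pixel)

print(pixels_per_clock(4, 12, 1))  # old game, 1 op/pixel: 4 pixels, 8 shader units idle
print(pixels_per_clock(4, 12, 3))  # newer game, 3 ops/pixel: still 4, everything busy
print(pixels_per_clock(4, 12, 6))  # very shader-heavy: drops to 2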

I'd say it's a not-so-bad card overall, because older games don't need all that horsepower, and newer games will make good use of it. Still, there are better options available.

My 2 cents. :wink: