8x AGP

Lohr

Distinguished
Apr 15, 2002
"In the next few weeks, VIA will be quietly replacing the Northbridge KT333 with the KT333A. This chip is compatible with the pin of the previous model and supports the new AGP 3.0 graphics standard. Compared to the AGP 2.0, there's one significant new feature: using an identical mechanical structure of the AGP slot, AGP 8x graphics cards are supported, and the 66 MHz clock speed remains the same. The only change is that a modified transfer protocol will allow more data to be transferred at this clock speed. However, there are currently no graphics cards available on the market that support AGP 8x."

Well, it looks like 8x AGP is finally coming out!
I was going to upgrade this month, but instead I decided to wait for the KT333A to come out first. My problem is that there are no cards that support 8x AGP yet.

My question is this:
Does anyone know if a 4x AGP card will run faster on 8x than it will on 4x, even though it wasn't designed for 8x?

A friend of mine told me this was true, but he can churn out some BS sometimes... so I thought I'd ask someone else.

I want to make sure I'm not wasting my money on 8x AGP if I buy the KT333A and the GeForce4 Ti 4600.
 
I don't think you would be wasting your money, since you can always upgrade your graphics card later without needing to replace your motherboard again.
I doubt that your 4x card will work any better in an 8x AGP slot, but I'm not a hardware expert.
 
Try running your current card at 2x with 3DMark 2001. You most likely won't see a huge difference, and 3DMark uses more AGP bandwidth than normal gaming does.
We don't even fully use the bandwidth AGP 4x gives us; it's like ATA133.
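To put that in rough numbers, here's a toy calculation. The 2 MB-per-frame upload figure is something I made up for illustration, not a measurement:

```python
# Toy illustration of the "we don't even use 4x" point. The 2 MB
# per-frame upload is an invented workload figure, not a measurement.
AGP_4X_MBPS = 1066                   # nominal AGP 4x throughput
FPS = 60
frame_budget_mb = AGP_4X_MBPS / FPS  # what the bus could move per frame

upload_per_frame_mb = 2.0            # hypothetical texture/vertex traffic
print(f"Bus budget per frame: {frame_budget_mb:.1f} MB")
print(f"Utilization: {upload_per_frame_mb / frame_budget_mb:.0%}")
```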

If you don't buy Windows, then the terrorists have already won! - Microsoft
 
The first AGP 8x card announced is one using the SiS 330 GPU, which is supposed to go on sale late this month (so they say, but I doubt it). The chip uses tile-based rendering and has a pixel shader (but no vertex shader), so it's a partially DX8-compliant card. It seems to score higher in 3DMark2001 SE than the GeForce4 MX 440 because it can run the pixel shader test, but whether many card manufacturers will pick it up is questionable, along with the driver quality. And no, an AGP 4x card won't benefit from AGP 8x: AGP 4x is version 2.0, whereas AGP 8x is 3.0.
 
What makes you think it will be a useless upgrade for two years? If it's going to be that long, we might as well just wait for PCI-X. Isn't that supposed to replace AGP and everything that uses the existing PCI bus?
 
Cards hardly stress AGP 4x, so why would AGP 8x increase any card's performance if the bus isn't a bottleneck?

The Windows Gods demand money to appease the BSOD! - Rev. Bill Gates
 
But we won't know for sure until the next generation of AGP 8x cards comes out. At least in my dreams they'll double current frame rates, even with all details and 8x AA. :smile:
 
Ah, but it's all in the limits of technology. Right now a graphics chip contains far more transistors than even a CPU. Adding more performance usually means either stepping up the clock speed or adding performance features. If we double the speed of the GPU, we must also double the speed of the memory to get double the performance, and only then would the needed bus bandwidth get anywhere near what AGP 4x already provides. The GPU would also have to be on a 90nm or smaller process to do it, and going below 90nm is at least two years away for graphics chips. The 32nm process is probably at least four years away; until we get there, AGP 8x will most likely not offer noticeable performance gains.
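Some rough numbers on that, with the Ti 4600's memory specs quoted from memory (so treat them as approximate). The point is that doubling GPU speed mostly stresses local video memory, which dwarfs the AGP bus either way:

```python
# Back-of-the-envelope version of the "double everything" argument.
# Ti 4600 memory specs quoted from memory; treat as approximate.
mem_clock_mhz = 325    # DDR, so ~650 MT/s effective
bus_width_bits = 128   # local memory bus width

local_bw_gb = mem_clock_mhz * 2 * bus_width_bits / 8 / 1000
print(f"Ti 4600 local memory bandwidth: ~{local_bw_gb:.1f} GB/s")
print(f"A doubled card would want:      ~{local_bw_gb * 2:.1f} GB/s")
# Versus the bus itself: AGP 4x is ~1.07 GB/s and AGP 8x ~2.1 GB/s,
# so most traffic has to stay in local video memory regardless.
```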

What's the frequency, Kenneth?