XGI Volari series cards

Spitfire_x86

<A HREF="http://www.tomshardware.com/technews/20030915_040606.html" target="_new">http://www.tomshardware.com/technews/20030915_040606.html</A>

What do you think about these cards? Are they going to be real cards available on the market, or just a paper tiger like the S3 DeltaChrome or Trident XP4?

----------------
<b><A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A></b>

<b><A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig & 3DMark score</A></b>
 
Hopefully it will work out. We could definitely use some more competition in the graphics market. However, given their track record, it could very well be hype.

Hmmm... dual GPUs. Didn't 3dfx try something like that? If they can get both GPUs to work fully at the same time, this could put up some good performance figures...

As each day goes by, I hug my 9600Pro just a little tighter.
 
Their Volari V8 Ultra looks very interesting.

Will they be cards that can match Nvidia's and ATI's high-end cards, e.g. the FX 5900? Or are they going to be like Matrox, a sort of side step into a specialised area?

Because on paper they are looking good <A HREF="http://www.xgitech.com/products/products_v8.htm" target="_new">http://www.xgitech.com/products/products_v8.htm</A>, though I think we've all learnt not to base our hopes on that, as Nvidia has shown us.


<font color=red>If My Dog Had A Face As Ugly As Yours, I'd Shave Its Bum and Walk It Backwards!</font color=red>
 
Looks good on paper, yes...
But then again, so did the Parhelia and its revolutionary 256-bit memory interface. *snort*


<b>I am not an AMD fanboy.
I am not a VIA fanboy.
I am not an ATI fanboy.
I AM a performance fanboy.
And a low price fanboy. :smile:
Regards,
Mr no integrity coward.</b>
 
Very nice on paper... One thing that concerns me is that they are going to use DDR-II and DDR... why both? And isn't DDR-II going to give heat issues? Especially with dual GPUs... it must be as big as a Ti 4200-4600 series card. But I definitely can't wait to see it in benchmarks, as well as some price quotes.
 
Um Poo, it was 512-bit.

You could actually have predicted Parhelia would be weak: it never had a full DX9 implementation, it ran at a relatively low clock speed, and, most of all, its unoptimized memory controller ran that 512-bit interface inefficiently.
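
Quick back-of-the-envelope, since bus width alone doesn't tell the story: peak memory bandwidth is roughly the bus width in bytes times the effective (DDR) data rate. With purely illustrative numbers, a 256-bit bus at 300 MHz DDR is 32 bytes x 600 MT/s, about 19.2 GB/s on paper; doubling the width doubles the theoretical figure, but only if the memory controller can actually keep the bus fed, which was exactly Parhelia's weak spot.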

This is quite an interesting GPU XGI has. IIRC SiS created that company.
It really does look strong, especially with DUAL GPUs. Memory and core clocks are set properly high. 8 pipelines and 4 VS 2.0 units, once again set up right. I didn't get the 16-pipe thing.
It is almost like the R300. No word on the memory controller, but the V-Drive thing is interesting. Pre-fetched rendering?
I'd like to see what Dave has on this.

Hope this can actually toast the competition. It certainly looks that way so far, as it adheres to the R300 design, which was almost fool-proof.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 
WOAH, upon reading the PDF, it actually has 16 pipelines! 😱 Have we arrived at such a generation already?

<A HREF="http://www.xgitech.com/products/XGIPRPBComp09032003En.pdf" target="_new">http://www.xgitech.com/products/XGIPRPBComp09032003En.pdf</A>

Only thing I don't like is that the website leans on the "EXTREME" angle too much. Every feature is written like it could jump out of the text into your face!
And the BroadBahn memory architecture name is way too religious-sounding. Don't like it.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 
EXTREME DIARRHEA!!!

Have you read this Maddox article? <A HREF="http://www.maddox.xmission.com/c.cgi?u=xtreme_bullshit" target="_new">Take your extreme marketing and shove it!</A>

"Mice eat cheese." - Modest Mouse

"Every Day is the Right Day." -Pink Floyd
 
Yeah I did. But if only his article revolved around PC Extreme stuff and not just edible products. EXTREME GFX CARD, WOO)))T!!!

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=blue><b>Are you ugly and looking into showing your mug? Then the THGC Album is the right place for you!</b></font color=blue></A>
 
Captain Obvious gives Willy's Link a big thumbs up!

<b><font color=red>Captain Obvious To The Rescue!!!</font color=red></b>
 
<i>"One thing that concerns me is that they are going to use DDR-II and DDR... why both?"</i>
Possibly DDR-II is for the expensive cards, and DDR is for the cheaper cards.

----------------
<b><A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A></b>

<b><A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig & 3DMark score</A></b>
 
Who cares about its name and what it calls its features; I just hope it gives Nvidia AND ATI a bit of competition.

One thing I wanted to ask you guys though: when they say "real DX9", do they mean it the way Nvidia did, or are they being truthful here and not just using marketing?

<font color=red>If My Dog Had A Face As Ugly As Yours, I'd Shave Its Bum and Walk It Backwards!</font color=red>
 
They should put a 9800 XT GPU and a Quadro 3000 GPU together:

in DX9 the 9800 will kick arse
in OpenGL the Quadro 3000 will kick arse

😀

Proud Owner of the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 
Ew. That would look worse going in than going out.

<font color=blue>other people's lives
seem more interesting
cuz they aint mine
</font color=blue>
<font color=green>Modest Mouse</font color=green>

TKS
 
I found it funny they called one of their features "Intelli-vision".

Intellivision used to be a console, similar to the old Ataris and Sinclair Spectrums. They've recently re-released the Intellivision as one of those control pads that plug straight into the TV and come with 10-25 Space Invaders class games preloaded.
 
What the hell are they going to do with 512 MB of RAM??? We don't really even need 256 MB right now!!! When is this thing supposed to arrive, anyway?

<b>nVidia cheated on me so I left her for ATi. ATi's hotter anyway...</b>
 
Benchmarks, we <b>NEED</b> benchmarks.


.....Oh yeah, and don't forget IQ tests as well.



My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 
"world's first dual graphics processor solution"

didnt ATi release some card (somethin like the Fury MAXX, not quite sure) with 2 processor on it?

Anyways agree with every1, more competition is better and if its no tiger paper, it can only do good.
Im also wondering about the software side, all that nice hardware wouldnt do any good if it doesnt have the proper software support...
 
I was wondering about Quantum3D Obsidian SLI, Voodoo2 SLI and Voodoo 5 5500 SLI - all of these solutions use multiple graphics chips. Heck, the Voodoo 5 6000 was supposed to be the first quad-chip board.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 
Yeah, exactly. The Voodoo series had dual processors way before.

A false claim like this even before the initial release makes you wonder about the company.

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
 
....Oh wait, they said "Dual Graphics Processors". According to Nvidia's coined term, no video card without a T&L unit can be a GPU. So they are basically saying they are the first to have 2 T&L-capable processors on one board. Still, it's pretty tricky, like you said, to make a claim like that. Who's Nvidia to define the term "GPU" anyway? The definition of GPU is just a rip-off of the old definition of a 3D accelerator chip, and the original Voodoo was already capable of taking load off the CPU anyway. Nvidia claimed to be the first to "take the load off the CPU" with the introduction of the GeForce GPU just because the chip had a T&L unit. In a sense, though, the T&L unit really did relieve <i>additional</i> stress from the CPU.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!