Radeon 2 and 3 info @ Digit-life


Negaverse23

The i810/DC-100/E is basically dead. The new i815P/EP chipsets no longer include integrated graphics.

=
Where will your PC be in the next 10 years?
 

Negaverse23

They make up for the loss by jacking up the prices of their games.

=
Where will your PC be in the next 10 years?
 

HolyGrenade

Something like that. But every console manufacturer loses money in the first year or so of console sales. They make their money back by taking a cut of the publisher's profit on each game sold. They also charge developers to use their proprietary technology for programming the games. Microsoft, however, will not be charging developers for developing for the Xbox. After all, it is still only C++ using the DirectX API, plus some assembler optimisations.
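
To give a rough idea of what that development looks like, here is a minimal sketch of bringing up a Direct3D 8 device in C++ (the window handle hWnd is assumed to come from the usual Win32 boilerplate, and error handling is trimmed):

// Minimal Direct3D 8 initialisation sketch, the same C++/DirectX style
// of code an Xbox or Windows game would use. hWnd is assumed to be a
// valid window handle from the usual Win32 setup.
#include <windows.h>
#include <d3d8.h>

IDirect3DDevice8* CreateDevice(HWND hWnd)
{
    // Entry point into the Direct3D 8 API.
    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
    if (!d3d) return NULL;

    // Describe the back buffer and presentation behaviour.
    D3DPRESENT_PARAMETERS pp = {0};
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;   // match the desktop format

    IDirect3DDevice8* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                      &pp, &device);
    d3d->Release();
    return device;   // NULL if device creation failed
}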


"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"
 

noko

That's why I don't understand how Microsoft is going to make money on it, unless they are depending on their own games to pull them through. Here is something that may interest you; it looks like the GameCube is more ready than expected:

<A HREF="http://www.msnbc.com/news/571518.asp#BODY" target="_new">http://www.msnbc.com/news/571518.asp#BODY</A>
 

HolyGrenade

Microsoft will still be taking a cut from the publishers.


"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"
 

rcf84

Well, the RADEON 2 is going to kick A$$ over the early GeForce 3, and maybe we will even see a RADEON 2 MAXX with 128MB or 256MB.

Then you've got this monster:
!!! RADEON 3 !!!

The only nice Intel guy.
 

noko

Looks like ATI is lining up some heavy artillery, doesn't it? ATI has more than just 3D graphics that will hit the market. The next All-In-Wonder will probably have picture-in-picture as well as digital video effects: video where you can do unique transitions between two video streams. The original Radeon could take two video streams and wrap one onto a 3D object, like a sphere or a plane (but not limited to those), and animate it into the background while bringing the other into the foreground. So doing videos on your computer will be even easier, with real-time effects.

The original Radeon could output to an HDTV set directly, except ATI decided not to expose that capability in the Radeon drivers due to its inability to do the interlaced modes on direct output. The Radeon 2 will be able to do both progressive and interlaced modes on direct output. What does that mean? It means that you will be able to hook up any HDTV to the Radeon and not only watch outstanding DVD but play your favorite games on your 45"-65" HDTV!! Imagine playing Quake III, Serious Sam, Unreal 2 or Max Payne at 1920x1080, or any of the other 18 HDTV resolution formats, sitting in your recliner! How about using your HDTV as your monitor? You will be able to do it. Just think of having a high-resolution 65" monitor to browse the internet or do work on!! Doing video editing on an HDTV set would be very easy. The potential is awesome.

ATI has been working on an HDTV tuner as well, so maybe that will also be incorporated into the All-In-Wonder. If you look at ATI's progress in video, you see their intent to bring the two worlds together. nVidia's lack of progress in video will hurt them as time goes on. Just like most chips, the Radeon is a work in progress. The Radeon 2 will use more of those capabilities, and the future looks very bright for ATI. The product line will be untouched by anyone. Bye bye nVidia. Just kidding :smile: .
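
To make the "transition between two video streams" idea concrete, here is a hypothetical C++ sketch of the simplest such effect, a per-pixel crossfade between two decoded frames. The packed-RGB frame layout and the Crossfade name are illustrative assumptions, not ATI's actual API:

// Hypothetical per-pixel crossfade between two decoded video frames:
// out = a*(1-t) + b*t, with t running from 0.0 to 1.0 over the
// transition. Frames are assumed to be packed 8-bit RGB; a card like
// the Radeon does this blend (and the fancier 3D-mapped variants) in
// hardware instead of on the CPU.
void Crossfade(const unsigned char* a, const unsigned char* b,
               unsigned char* out, long bytes, float t)
{
    for (long i = 0; i < bytes; ++i)
        out[i] = (unsigned char)(a[i] * (1.0f - t) + b[i] * t + 0.5f);
}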

<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/14/01 12:33 PM.</EM></FONT></P>
 

HolyGrenade

That's great, but if you could afford a 65" HDTV, you could probably afford a professional video editing suite.


"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"
 

noko

HDTV prices are dropping and will continue to do so. Plus, for professional editing you're talking $10,000-plus for a semi-professional setup, just for the equipment, not including the software. HDTVs go for $1,000 and up, and prices are dropping as we speak.

<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/14/01 11:25 PM.</EM></FONT></P>
 

HolyGrenade

Whoa! In the UK a normal 32" Sony FD Trinitron Wega would put a £1500-£2000 hole in your pocket. (One of the models does, however, have VGA inputs, so you can connect your "Sony VAIO" to the TV.) Also, they're all dual-tuner with channel scan and frame scan and all the other bells and whistles.



"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"
 

noko

That's cool, but not all HDTVs have VGA inputs; in fact, very few do. Hopefully that will change, but it is not the case now. Most of the people who own HDTVs today have no VGA inputs. So the Radeon 2 will be unique: you won't have to buy another HDTV to use it. So what is your point? Buy only a VGA-input-capable HDTV instead of the one you really want or can afford? Do those HDTVs do editing with digital video effects? The Radeon 2 can. The Radeon 2 is a new breed of card; it won't just be a fancy fast game card (much faster than that GF3 junk) but a multimedia splendour :lol: .

<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/14/01 01:40 PM.</EM></FONT></P>
 

HolyGrenade

I think you got me wrong there. These are normal PAL TVs, though they also support NTSC and the other PAL variants. They're not HDTVs. They're normal CRT TVs, though they are perfectly flat, horizontally and vertically.

What I was saying is, they are a bit expensive.

The Radeon 2 would seem great for a personal video editing suite. I suppose it would be going up against Matrox products. What would be great is digital video support using IEEE-1394 connections, either via ports on the card itself, on the motherboard, or on a separate card.

But I'm not sure the Radeon 2 will actually be faster than the GeForce 3. ATI did say the Radeon would be faster than the GeForce 2. Nonetheless, it seems likely to be a good overall card.


"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"
 

rcf84

The Radeon 2 is 250MHz. The GeForce 3 Ultra could be around 250MHz too, so it could get ugly. Also, the S3 Columbia is coming out. Still, ATi is my pick. They have the ace, and it's called "All-In-Wonder".

The only nice Intel guy.
 

noko

Sorry, I thought we were talking about HDTVs. Here in the States HDTVs are getting more common; still kind of expensive, but the prices are rapidly coming down. Look at the benchmark below of Giants, comparing a GF3, Radeon, GF2, MX and TNT. You will be surprised:

<A HREF="http://www.aceshardware.com/articles/reviews/GF3vsK2_Part2/GiantsCPU.gif" target="_new">Benchmark GF3/Radeon</A>

A very modern game with a lot of hardware features being used. The GF3 beats the Radeon by only 1 FPS on a T-Bird 1.33GHz, and on a Duron 850 the Radeon ties the GF3!!!

This may just be a problem with that one game and might not mean anything, but wait, check this out:

<A HREF="http://www.beyond3d.com/reviews/nvidia_gf3/visiontek/benchmarks/serioussam/ssam_noaachart.jpg" target="_new">Benchmarks of Serious Sam, without any Anti-Aliasing, 64-Tap anisotropic in Blue</A>

Three series of tests were run at Beyond3d.com. Look at 1024x768 32-bit Extreme, and notice how dramatically the GF3's performance is hit when the quality settings are upped. What is interesting to know is that at this setting the GF3 is doing 8:1 anisotropic filtering while the Radeon is doing a higher 16:1 anisotropic filtering, both doing 64-tap (meaning taking 64 samples from the source texture for each pixel being applied to the output mapped texture). My Radeon with a slower T-Bird does 41.8 FPS with the exact same settings. The GF3 in this case is doing Radeon speeds. Yes, the GF3 can produce a lot of FPS with crappy output, but for it to do high-quality output its frame rate is drastically reduced to the level of the Radeon. In this test I did not overclock my stock 182.25MHz speed.
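
For anyone wondering what "64-tap" means in practice, here is a simplified conceptual C++ sketch: take many samples along the pixel's axis of anisotropy in the texture and average them. Texture and Sample() are illustrative stand-ins, not a real API, and real hardware uses weighted, mipmapped footprints rather than a plain average:

// Conceptual sketch of N-tap anisotropic filtering: sample the texture
// N times along the pixel's axis of anisotropy and average the taps.
struct Color { float r, g, b; };
class Texture;                                       // opaque placeholder
Color Sample(const Texture& tex, float u, float v);  // one filtered tap

Color AnisoSample(const Texture& tex,
                  float u0, float v0,   // start of the pixel footprint
                  float u1, float v1,   // end of the pixel footprint
                  int taps)             // e.g. 64 for "64-tap"
{
    Color sum = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < taps; ++i) {
        float t = (i + 0.5f) / taps;                 // step along the axis
        Color c = Sample(tex, u0 + (u1 - u0) * t,
                              v0 + (v1 - v0) * t);
        sum.r += c.r; sum.g += c.g; sum.b += c.b;
    }
    sum.r /= taps; sum.g /= taps; sum.b /= taps;     // average of all taps
    return sum;
}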

I could point you to other sites if you wish, but when the GF3 is producing Radeon-quality images, its frame rate dives and matches the Radeon. The Radeon 2 will have two more pipes and will be about 70MHz faster, with much faster RAM, better HyperZ, etc. The Radeon 2 will quite frankly toast the GF3!!
 

HolyGrenade

C'mon, this is a cheap shot I would expect from someone like powervr2 (no offence, powervr2 :wink: ).

The GeForce 3 performs brilliantly on virtually all the benchmarks. The image quality is also brilliant, except for the persisting S3TC texture degradation in translucent textures.

You can't compare the GeForce 3 with the Radeon. That is just vulgar, man! The GeForce 3 is a truly superior card. Do you think the Radeon will be able to hold its own when all the high-poly shader games are released in the near future? The Radeon 2 will do OK, but we'll still have to wait and see if it is capable of outperforming the GeForce 3. Also, by the time it is released, the NV30 will be near its own release date, so we'll see what happens with the pricing. The GeForce 3 is likely to have had a few price drops by then. Besides, wait till the Det 12.xx drivers are released; they have quite a bit of GF3-specific code.

As for the Radeon 64, it is GeForce 2 GTS-class hardware. At AnandTech, doesn't it score on par with the GTS? This is where it was supposed to pull away from the nVidia cards with its HyperZ, but it doesn't happen.

I'm not totally convinced by nVidia's Z-occlusion culling techniques, just as I have doubts about HyperZ. nVidia has implemented fast Z-buffer clearing, early Z checking and Z culling. The most effective method, the Z-culling query, is supported in neither DX8 nor OpenGL 1.2. They don't even have the OpenGL extensions for it in their Det 6.50. I think we have to wait till 12.xx.
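
For reference, the early-Z idea behind both HyperZ and nVidia's scheme boils down to testing depth before paying for pixel shading, as in this simplified software-rasteriser sketch (the names here are illustrative, not either vendor's hardware):

// Simplified early-Z: test the pixel against the depth buffer *before*
// paying for texturing and shading. Real Z-culling hardware rejects
// whole blocks of pixels at once; ShadePixel() is an illustrative
// stand-in for the expensive per-pixel work.
unsigned int ShadePixel(int x, int y);   // hypothetical, defined elsewhere

void DrawPixel(int x, int y, float z, int width,
               float* zbuf, unsigned int* framebuf)
{
    float& stored = zbuf[y * width + x];
    if (z >= stored)                     // occluded: skip shading entirely
        return;
    stored = z;                          // depth test passed: update Z...
    framebuf[y * width + x] = ShadePixel(x, y);   // ...then shade the pixel
}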

Talking about HSR, tile-based rendering also falters when it comes to 3D textures and some other features I can't remember. Just thought I'd mention that, since I've started ranting.

Oh well! I think I should stop now.


"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"
 

noko

Not cheap at all; it's backed by testing, unless you want to ignore testing results. The GF3 is supposed to be the next generation of card, and here we have a lowly Radeon keeping up with it. Let's all bow down to nVidia, ahmmmmmmmm - - ahmmmmmmmmm - - - ahmmmmmmmmm. NOT!!! Do you want to see more mediocre results? I am not disputing that the GF3 can pump out FPS; I am arguing that when the GF3 is producing high-quality images like the Radeon, its performance dives hard. REAL HARD. Here are some Radeon images doing their thing at max anisotropic filtering; all are from the Radeon:

<A HREF="http://home.cfl.rr.com/noko/shot.jpg" target="_new">A</A>
<A HREF="http://home.cfl.rr.com/noko/shot1.jpg" target="_new">N</A>
<A HREF="http://home.cfl.rr.com/noko/shot2.jpg" target="_new">I</A>
<A HREF="http://home.cfl.rr.com/noko/shot3.jpg" target="_new">S</A>
<A HREF="http://home.cfl.rr.com/noko/shot4.jpg" target="_new">O</A>
<A HREF="http://home.cfl.rr.com/noko/shot5.jpg" target="_new">T</A>
<A HREF="http://home.cfl.rr.com/noko/shot6.jpg" target="_new">R</A>
<A HREF="http://home.cfl.rr.com/noko/shot7.jpg" target="_new">O</A>
<A HREF="http://home.cfl.rr.com/noko/shot8.jpg" target="_new">P</A>
<A HREF="http://home.cfl.rr.com/noko/shot9.jpg" target="_new">I</A>

The Radeon is pumping out around 45-85 FPS while producing these images. Do you want comparison shots between the Radeon and the GF3? You might be surprised again.

<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 05/14/01 09:27 PM.</EM></FONT></P>
 

OzzieBloke

The possums are at it again... blimey!

At least noko is giving some numbers and pictures to look at here...

But wasn't the GF3 all about improving visual quality rather than frame rates? If that's the case, and the original Radeon is on par for visual quality, then I would think the Radeon 2 is going to make the GF3 hurt. I don't know how badly, but hurt it will, at least until they bring out the "GF3 Ultra" or "GF3 Super" or "GF3 I creamed my undies" or whatever it is they are producing.

Cow with legs spread wide either dead or playing 'cello.
 

HolyGrenade

"16:1 anisotropy and 8:1 anisotropy, both at 64 tap"

What did you mean there? Did you mean 64-tap anisotropy with 16-sample trilinear?

I didn't quite get you there. And oh yeah, hold the excitement until the product is released.


"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"
 

HolyGrenade

nVidia is releasing the NV17, which will be a budget version of the GeForce 3. It will be priced at around $150 and is set to outperform the GF2 GTS and Pro, and it will have TwinView. Testing cards should be available after June, closely followed by the benchmark cards, while consumer versions are set for an autumn release. This chip will also be integrated into the nVidia Crush-based motherboards, though I'm thinking those will probably have just one monitor output. The motherboards in the UK will be about £120.



"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"