EVGA 6800GT for sale

jean

Archived from groups: alt.comp.periphs.videocards.nvidia

"Jean" <nospam@nospam.ca> a écrit dans le message de news:ydUHc.69896$_p5.1438282@wagner.videotron.net...
>
> please reply to sirocco4@hotmail.com
>
>
>
>
 
Guest

On Sat, 10 Jul 2004 13:25:26 -0400, "Jean" <nospam@nospam.ca> wrote:

>
>"Jean" <nospam@nospam.ca> a écrit dans le message de news:ydUHc.69896$_p5.1438282@wagner.videotron.net...
>>
>> please reply to sirocco4@hotmail.com
>>
>>
>>
>>
>
>

If I wait long enough, will it drop to $200cdn ?
( Might be worth a drive to the border then )

You must have a very good reason indeed for dumping the card.
Smoke, flames, doesn't bring you your breakfast tea ? Please tell.

Seems just a little early to dump it, since both Ati X800 and
nVidia 6800 prices will be falling steeply over the next few
months, so if you want to swap to Ati X800 later, you will
probably not lose financial ground.

You do know that the 6800 driver versions are still all beta
and still not part of the universal Detonator releases ? And Ati
is having fun and games with really lousy QC on their latest
drivers. Echoes of a couple of years ago. For example,
see the threads on Thief3.

John Lewis
 
Guest

> If I wait long enough, will it drop to $200cdn ?
> ( Might be worth a drive to the border then )
>
> You must have a very good reason indeed for dumping the card.
> Smoke, flames, doesn't bring you your breakfast tea ? Please tell.

I suspect he got a 'dud', aka poor overclocker, since he won't give any
performance numbers.

Jeff B
 

jean


no, it overclocks well at 400 MHz / 1100 MHz

but there's a problem with the refresh rate, and poor Far Cry performance...

I prefer ATI


"Jeff B" <fake@addy.com> a écrit dans le message de news:iKZHc.63893$XM6.36024@attbi_s53...
 
Guest

On Sat, 10 Jul 2004 18:14:47 -0400, "Jean" <nospam@nospam.ca> wrote:

>no, it overclocks well at 400 MHz / 1100 MHz
>
>but there's a problem with the refresh rate
>
> and poor Far Cry performance...

Upgrade your CPU ( when you can ) !!!. The higher levels of AI,
the floating-point operations with binoculars on, and several other
internal game-engine activities can drastically drive the frame rate
down due to CPU loading. Verify CPU loading by periodically
checking your CPU ( not GPU ! ) temperature while the game
is running. Use AIDA or similar running on desktop to monitor
CPU sensor temp before starting Far Cry and then Alt-Tab in
and out of the game to instantaneously view the CPU temp.

I have a P4 running at 3.35GHz and the CPU temperature
measurements on Far Cry were very revealing.
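
For anyone who would rather log readings than Alt-Tab repeatedly, a minimal
Python sketch of the same idea follows (purely illustrative, not part of AIDA
or the game). It assumes the third-party psutil package; psutil only exposes
sensors_temperatures() on some platforms, so the script falls back to plain
CPU-load logging elsewhere:

import psutil

def read_cpu_temp():
    """Return a CPU temperature reading if the platform exposes one, else None."""
    try:
        temps = psutil.sensors_temperatures()   # available on Linux/FreeBSD only
    except AttributeError:
        return None
    for entries in temps.values():
        if entries:
            return entries[0].current
    return None

def log_load(interval=5.0):
    """Print CPU load (and temperature when available) every `interval` seconds.
    Start it before launching the game, then compare the figures during play."""
    while True:
        load = psutil.cpu_percent(interval=interval)   # averaged over the interval
        temp = read_cpu_temp()
        if temp is None:
            print(f"CPU load {load:5.1f}%")
        else:
            print(f"CPU load {load:5.1f}%  temp {temp:4.1f} C")

if __name__ == "__main__":
    log_load()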

You seem to be throwing the baby out with the bathwater, but you
obviously have a great deal of money anyway if you decided to
buy either a 6800GT or X800 XT PE at this time.

It's summertime in Canada, and summer there is short enough that it is a
pity to miss a great time outdoors, so forget video cards until fall,
when many driver issues ( both Ati and nVidia ) will be resolved
and prices of the high-end video cards are likely to be far more
reasonable.

>I prefer ATI
>

OK, I believe that Canada is a fairly free country. Then why
did you buy the 6800GT in the first place? Hopefully you will be
happy with your alternative purchase and won't be wailing about the
lack of DX9.0c hardware features in a few months' time.

John Lewis
 
Guest

> Upgrade your CPU ( when you can ) !!!. The higher levels of AI,
> the floating-point operations with binoculars on, and several other
> internal game-engine activities can drastically drive the frame rate
> down due to CPU loading. Verify CPU loading by periodically
> checking your CPU ( not GPU ! ) temperature while the game
> is running. Use AIDA or similar running on desktop to monitor
> CPU sensor temp before starting Far Cry and then Alt-Tab in
> and out of the game to instantaneously view the CPU temp.
>
> I have a P4 running at 3.35GHz and the CPU temperature
> measurements on Far Cry were very revealing.

I don't get it. What's the point of monitoring CPU temp if the game is
running properly? CPU temp is what it is and doesn't affect game
performance, so how does knowing your CPU temp help?

> forget video cards until Fall
> when many driver issues ( both Ati and nVidia ) will be resolved

What are these "driver problems" you keep referring to in your posts?
Please tell us, since nobody else is aware of them.

Jeff B
 
Guest

Jean wrote:

> no, it overclocks well at 400 MHz / 1100 MHz
>

These are good numbers. You should hang onto the card until
the v1.2 patch comes out so you can use the Shader Model 3.0 path,
assuming you have already upgraded to DirectX 9.0c. It will give you
better performance. On the other hand, the 3Dc normal-map compression
technology, which ATI has and nVidia doesn't, is reason enough to go with ATI.

Jeff B
 

jean


You know John... you're right... it's summertime :) BTW I have a P4 3.2C and watercooling (Exos) on
both CPU and GPU, so temp was never a problem here... My GPU temp is 47-49C idle and 51C after
playing... I didn't even try to push the GT further than 400 MHz / 1100 MHz, but I bet I could easily.

The point is nVidia does not comply with the DVI spec, hence the loss of resolution and refresh rate...

http://www.extremetech.com/article2/0,1558,1367918,00.asp

I didn't know that until I bought the GT... I might order an X800XT to compare and get a refund if I
prefer the GT...


Canada is not only a free country but also home of the best video card makers :)






"John Lewis" <john.dsl@verizon.net> a écrit dans le message de news:40f09345.3558508@news.verizon.net...
 
Guest

Jean wrote:

> You know John... you're right... it's summertime :) BTW I have a P4 3.2C and watercooling (Exos) on
> both CPU and GPU, so temp was never a problem here... My GPU temp is 47-49C idle and 51C after
> playing... I didn't even try to push the GT further than 400 MHz / 1100 MHz, but I bet I could easily.
>
> The point is nVidia does not comply with the DVI spec, hence the loss of resolution and refresh rate...
>
> http://www.extremetech.com/article2/0,1558,1367918,00.asp
>

I connected my BFG 5600U to my Apple 23" LCD monitor and it didn't work AT ALL,
although it worked perfectly with my Sammy 19" LCD screen. The failure
had nothing to do with resolution, because during POST the res is only 640x480.
Tomorrow I'll take possession of my new BFG 6800U OC (Bestbuy.com)
and I'll be testing it on both my Apple display and my 61" Samsung
DLP HDTV set (DVI port). I'm very anxious to see if the 6800U works with the
Apple. (My X800 Pro worked perfectly with all the displays mentioned above.)

Jeff B
 
Guest

On Sun, 11 Jul 2004 05:28:49 GMT, Jeff B <fake@addy.com> wrote:

>
>> Upgrade your CPU ( when you can ) !!!. The higher levels of AI,
>> the floating-point operations with binoculars on, and several other
>> internal game-engine activities can drastically drive the frame rate
>> down due to CPU loading. Verify CPU loading by periodically
>> checking your CPU ( not GPU ! ) temperature while the game
>> is running. Use AIDA or similar running on desktop to monitor
>> CPU sensor temp before starting Far Cry and then Alt-Tab in
>> and out of the game to instantaneously view the CPU temp.
>>
>> I have a P4 running at 3.35GHz and the CPU temperature
>> measurements on Far Cry were very revealing.
>
>I don't get it. What's the point of monitoring CPU temp if the game is
>running properly? CPU temp is what it is and doesn't affect game
>performance, so how does knowing your CPU temp help?
>

Sorry, you totally missed the point. Comparing CHANGES in CPU temp
with those on the GPU is an excellent way of judging whether a drastic
frame-rate drop is due to the GPU or the CPU. In Far Cry, for example,
when the binoculars are turned on, the frame rate drops drastically
and the CPU temp rockets up 5-10 degrees C. The core temp
on my O/C'd FX5900 does not change at all with the binoculars on. The
frame rate here is clearly being limited by CPU activity, most likely
the floating-point operations.
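
To make the comparison concrete, here is a small Python sketch of that delta
test (illustrative only: the Sample type, the 3-degree threshold and the
example numbers are made up, and the temperatures would come from AIDA, the
driver monitor or whatever tool you use):

from dataclasses import dataclass

@dataclass
class Sample:
    cpu_temp_c: float   # CPU sensor temperature
    gpu_temp_c: float   # GPU core temperature

def likely_bottleneck(baseline: Sample, in_game: Sample,
                      threshold_c: float = 3.0) -> str:
    """The side whose temperature rises much more between the baseline and
    the in-game sample is probably the one limiting the frame rate."""
    cpu_delta = in_game.cpu_temp_c - baseline.cpu_temp_c
    gpu_delta = in_game.gpu_temp_c - baseline.gpu_temp_c
    if cpu_delta - gpu_delta >= threshold_c:
        return "probably CPU-bound"
    if gpu_delta - cpu_delta >= threshold_c:
        return "probably GPU-bound"
    return "inconclusive"

# Numbers shaped like the binoculars case above: CPU up ~8 C, GPU flat.
print(likely_bottleneck(Sample(45.0, 60.0), Sample(53.0, 60.0)))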

John Lewis
 

jean


OK then... I'll overclock the CPU to 3.46 GHz and see the difference.


"John Lewis" <john.dsl@verizon.net> a écrit dans le message de news:40f1f8fe.1861283@news.verizon.net...
 
Guest

"Jean" <nospam@nospam.ca> wrote in message
news:W%eIc.20573$UO6.205601@wagner.videotron.net...
> You know John...you're right...it's summertime :) btw I have a P4
3.2c and watercooling (Exos) on
> both CPU and GPU...So temp was never a problem here... My GPU temp is
47-49c idle and 51c after
> playing... I didn't even try to push the GT further 400mhz + 1100mhz but
I bet I could easily.
>
> The point is Nvidia does not comply to DVI so the lost of resolution and
refresh rate...
>
> http://www.extremetech.com/article2/0,1558,1367918,00.asp
>
> I didn't know that until I bought the GT... I might order an X800XT to
compare and get a refund If I
> prefer the GT...
>
>
> Canada is not only a free country but also home of the best video cards
creators :)


ROFL. Nice obvious bias, moron. The DVI problem is a driver issue, as
confirmed by BFG. If you actually *had* a BFG GT you'd know that. A fix is
in the works, moron. "nVidia does not comply with the DVI spec" <-- ROFL!!!

ATI is not "the best video card maker". Keep losing.
 

jean


Like DVI was released yesterday....

Geez, go back to playing with your GeForce 4 MX.


"ccs" <temp@no.com> a écrit dans le message de news:dCCIc.19805$yc.15953@fed1read06...
 
Guest

"Jean" <nospam@nospam.ca> wrote in message
news:b9JIc.64963$wQ5.956302@weber.videotron.net...
> Like DVI was released yesterday....
>
> Geez, go back to playing with your GeForce 4 MX.

Actually I'm running a REAL GT, unlike you. Quit top-posting, you newbie.