GT300 NEWS


uncfan_2563

Distinguished
Probably, but I made a decision a while ago not to change my attitude, as it was better than most.

If my rules were enforced this would be a far better forum, as most forums don't let idiots like rangers run riot. Yes, you can call me a hypocrite, and yes, you shouldn't use personal insults and all that, but at least I give something of a damn about the content of these forums.

How you can think he is not wrong I do not know; a basic rule of forums is usually not to post an entire article of someone else's work unless it's a press release or something.

Is it my fault I do not like people posting as they please, bumping down other people's posts that may be worth reading?

Perhaps if people were to use some common sense these forums would not have the dross that accumulates like scum on the surface.

Rangers is part of that scum, IMO.

If you don't like it, leave. YOU'RE NOT A FREAKING MOD. You have no right to talk; contact a mod and get them to deal with it, it's not your job. Seriously, if so many people in a single thread are yelling at you, the problem can't be us.
 

4745454b

Titan
Moderator
First, yes, Charlie is biased as hell. I'm sure we can pretty much all agree on that. Second, we only need to make one assumption from his article. We should all be able to agree that Nvidia has a tough road ahead. Things they need to do include designing a new chip, testing a new (for them) process (40nm), implementing a new DX level, implementing GDDR5 and its memory controller, etc. The only assumption we need to buy from Charlie's article is whether or not the chip has taped out. If it hasn't, they are screwed. You don't need to believe his time frame, or the issues they will/might have with the tessellator, or anything else he wrote. Just think: if they still have to finalize the chip, they will need to do their test runs first. They need to test those chips to make sure they work as they believe, and this will take even more time. And seeing as they have no (publicly) working parts that support DX10.1/11, there is no guarantee that they will get it right the first time.

I'm not saying they won't. They are a big company with lots of money to throw at this problem. (They have the other, possibly more damaging, problem of not being able to scale their chips down; thus they might end up limiting the low end to DX10 only.) Seeing as we don't usually get it right the first time, I'm having a hard time believing that everything will go great with their new chips. I for one happen to believe his time frame (assuming it hasn't already taped out) and think we'll see new chips from them VERY late in the year. The question now is: has it taped out or not?
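To put rough numbers on the "test runs take time" point (these cycle times are ballpark assumptions on my part, not anything from Charlie's article):

\begin{align*}
% all cycle times below are assumed ballpark figures, not sourced
t_{\text{first silicon}} &\approx 10\text{--}12 \text{ weeks after tape-out}\\
t_{\text{bring-up}} &\approx 4\text{--}8 \text{ weeks of validation and driver work}\\
t_{\text{respin}} &\approx 10\text{--}12 \text{ weeks per extra spin}\\
t_{\text{total}} &\approx 24\text{--}32 \text{ weeks from tape-out to launchable silicon}
\end{align*}

On those assumptions, even a mid-year tape-out with a single respin lands very late in 2009, which is exactly the time frame in question.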

Edited for spelling.
 

rangers

Distinguished
Nov 19, 2007
1,563
0
19,790
If my rules were enforced you would have been banned for your insults. Instead we have to put up with your crap, derogatory remarks, and small-mindedness, basically chasing away new members, who maybe think we are all dicks like you.
 
Currently, the 4770 on 40nm appears to be solid, though what are those yields? And since the 4770 overclocks so well, it does blow away the leakage problem, as said. Going to 55nm from 65nm, as Charlie said, did take longer than anyone thought it would. Using a huge die makes this more of a likelihood, so again, he's right on. The rest? Could be.
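For what it's worth, the standard back-of-the-envelope for why a huge die hurts on a new process is the Poisson defect-yield model. The defect density and die areas below are purely illustrative assumptions, not real 40nm figures:

\[
Y = e^{-D_0 A}
\]
\begin{align*}
% D_0 = 0.5 defects/cm^2 is an assumed, illustrative defect density
\text{4770-sized die},\ A \approx 1.4\ \text{cm}^2:&\quad Y \approx e^{-0.5 \times 1.4} \approx 50\%\\
\text{GT300-class die},\ A \approx 5.0\ \text{cm}^2:&\quad Y \approx e^{-0.5 \times 5.0} \approx 8\%
\end{align*}

Same process, same defect density; the bigger die just gives every chip far more chances to catch a killer defect, which is why a healthy 4770 doesn't tell you much about a ~500 mm² part.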
 

jennyh

Splendid
ATI must be absolutely miles ahead on 40nm. The 4770 is all the proof that is needed, tbh.

If Nvidia release the G300 before the end of this year I'll be amazed. There is simply so much that can go wrong; they need a miracle to get this out right the first time.

On the other hand, Nvidia have done almost nothing in terms of innovation for years. Maybe they are about to pull an ATI R700 kind of trick out of the bag with the G300. Truth is, they had better.
 

MO BEEJ

Distinguished
Guys, helloooooooooooo, WAKE UP. It's not an Nvidia vs ATI vs Intel thing; this writer is fighting back. I will not state details, but remember the Fudzilla editor wrote something, then a lot of the web attacked him, bla bla bla; he is just fighting back, c'mon. HELL, I will wait for Intel Larrabee before I buy either. Could my graphics card be named Celeron or Atom or i7? Hmmmmmmmm!!!!
 
Since the release of the G80 series, we've all been waiting on nVidia for that next killer card. Now it appears they're going to be hard pressed just to make a DX11 card in time for W7's release, or even by the end of the year. What's been going on over in GreenLand?
 

MO BEEJ

Distinguished
BTW, I think Nvidia is already working on x86 processors; maybe that's why their innovation is so slow. Unlike ATI, who already have CPUs and so just have to iterate, Nvidia has to start from zero.
 

jennyh

Splendid
The real question is, what the hell have Nvidia been doing with their cash? For their sake they had better be sitting on a big mountain of money, because they are losing it hand over fist right now, and it's gonna get a lot worse before it gets better.
 

jennyh

Splendid



Yep, but unfortunately you need this thing called an 'x86 licence', and they don't have it. :p
 

MO BEEJ

Distinguished
I think Intel's GPU will lose to ATI and Nvidia since it's fairly, FAIRLY, new to the market (counting the GMA and Express family chipsets as GPUs is BAD). But there is always a "but": if Intel does SDF (scalable dual fire; I made that up, like SLI and CrossFire), Intel WILL prevail. You can't argue with Intel's support and drivers, so we might see 2x, 3x, or even 4x the performance. All in all, wait for Q1 2010; everything will show by then, for sure.
 

jennyh

Splendid
I personally think Intel will buy Nvidia within two years, EU permitting. I think Larrabee is a great idea, but it is ahead of its time, and Intel need a really decent GPU for the upcoming fusion battle.

AMD are well behind Intel in CPUs, but their IGPs are miles ahead. This is why AMD bought ATI: not to make discrete graphics, but for CPU/GPU fusion. Unless Intel can suddenly make decent IGPs, they will have a huge disadvantage vs AMD in fusion; that is why I think they will buy out Nvidia once Nvidia hit rock bottom, which tbh can't be far away.
 
From a few rumors and good guesses, LRB can be shrunk to lesser core counts, and thus make their IGPs LRB-styled. Going fusion in this manner might give Intel a nice boost indeed. I keep hearing 2.5x what they currently have, which puts them close to the current high end, until ATI releases its R700-based IGP; then it'll still be slow, but fairly close to nVidia's IGP.
 

successful_troll

Distinguished



I bet you are enjoying playing games at 20-30 fps, at 720p and sometimes even sub-HD resolutions, while struggling with your controller in FPS games. Different people, different standards...
 

L1qu1d

Splendid
Well, there's exclusives, there's consistency in updates and DLC. Yes, a controller for FPS isn't that great, but it's good to have everything at your fingertips, and I have a 720p TV, so the resolution doesn't affect me.

Plus, with the console I have a bigger audience, seeing as not everyone knows computers, so most of them choose console, which means better online gameplay for certain games (HAWX).

I have two 285s in a PC that hasn't been touched. :p

Also, console games have resale value; PC games rarely do.

Like you said, different opinions. :)

P.S.

Yes, I love playing GT5 at 60 fps and Killzone 2 at 30 fps. ;) I mean, it works, right? :)
 

L1qu1d

Splendid
Oh, I'm not whining at all. I mean, I'd jump right back on board with PC when I see some titles that catch "my" eye.

Just like Killing Floor. Anything that can play on my laptop is welcome. :)

Don't worry, stranger, I'll be back to annoy you as soon as DiRT 2 is out ;) or SC2 is. :) Hopefully by then we'll all have computers that won't cry when playing Crysis. :p
 

successful_troll

Distinguished



Well, after this attempt I wish I could say that you "make some valid points", but you really don't.

There are tons of exclusives in PC gaming, like StarCraft 2, Diablo 3, S.T.A.L.K.E.R., Crysis, The Witcher, Empire: Total War, etc.; too many to count. Furthermore, if you like RTSes and MMORPGs, then PC is your only choice.

Due to the high development costs that come from royalty fees, and the high risk of failure, console gaming is dominated by overhyped, shallow titles. Developers are afraid to experiment. PC gaming, as an open platform, is the only place where innovation and creativity from indie developers run rampant. Add to that unlimited backwards compatibility, plus the fact that the PC gets "Xbox 360 only" games like Gears of War, Mass Effect, and soon Fable 2, and it's easily the best choice.

If you like gamepads, you can use any kind of them on your PC, so I don't know why you brought that up. But I really can't believe you brought up DLC, which is maybe the worst trend ever to come to video gaming. Cutting content from the release to sell it later, while PC gamers get unlimited content for free from mods, is simply outrageous and a rip-off.

I don't know what kind of audience you are talking about, but in my experience with Xbox 360 Live I always had to mute everybody to avoid whiny kiddie voices talking "gangsta". Steam is by far more popular in terms of users than PSN, and that's a fact.

Well, developers don't get anything from used games; you are only making greedy brick-and-mortar owners richer. A third of GameStop's annual profits came from reselling used console games: huge amounts of money the developers see nothing of. It's almost as bad as piracy, and you don't want to bring that up...

So... why are you playing console games again? Weird taste is the only valid reason, especially when you have a good rig like you do.