John Carmack is a wanker

mpjesse

http://www.dailytech.com/John+Carmack+Speaks+on+DX10+Vista+Xbox+360+PS3+Wii/article5665.htm

Is it just me, or is he out of touch with the rest of the gaming world? He's turning into another American McGee... a victim of his own celebrity status.

On dual core/multicore processors:
“So we’re dealing with it, but it’s an aspect of the landscape that obviously would have been better if we would have been able to get more gigahertz in a processor core. But life didn’t turn out like that, and we have to just take the best advantage with it.”

Ok buddy... most of your peers disagree and Doom3 still sucks.

On DX10:
“All the high-end video cards right now -- video cards across the board -- are great nowadays,” he said. “Personally, I wouldn’t jump at something like DX10 right now. I would let things settle out a little bit and wait until there’s a really strong need for it.”

I guess he hasn't seen the Crysis trailers. 75% of all games in development are DX10 titles. The only shite game that isn't is his Quake Wars.

This guy is officially irrelevant IMO.
 
When John Carmack speaks, the industry tends to listen. While it can be argued that his influence on the gaming industry today isn’t as big as it was when nearly every 3D shooter was using one of his Quake engines, he is still regarded as part of the heart that keeps PC gaming alive. He continues to influence gaming hardware too, especially in the area of graphics. In fact, NVIDIA and ATI consult with John Carmack on design decisions when engineering new GPUs.

So who wrote this, Mr. Carmack himself?

He is a snapperhead :lol:
 
Calm down, slippery. I don't think there's a real need to go starting threads calling him a wanker.

What he said on the dual-core issue, that we should have continued the gigahertz race, is crap, but you can sort of see where he's coming from, seeing as there's not widespread support for dual core atm. If he'd said that we should have continued enhancing core architectures as well, he might sound less stupid. That quote doesn't deserve any credibility because it's just stupid, but if you look into it there is some meaning behind it, so I think calling him a wanker is inappropriate. I'm not denying that going to dual-core was the right step.

The DX10 quote does make sense, though. Sure, 75% of games in development are DX10, but they are in development. Mr Carmack said he would not jump on them right now, which makes sense if you are only playing DX9 games and the G80's grunt is more than you need. Forking out top dollar for a high-end DX9 card would be stupid, but I personally would hesitate to buy a DX10 card atm when I can get a cheap second-hand card to hold me over until the DX10 cards have been tested in the DX10 environment.

Quake Wars does not look to be 'shite'.

IMO you're overreacting by making a thread about this and calling him a wanker.
 
Well I'd have to disagree with you on that. He's not talking about multi-cores being bad technology; he's talking about the complications of making a game multi-core. His comment on gigahertz is a bit "noobish" but you get his drift.

And about DX10, all he's saying is that there is no need to jump at DX10 CARDS right now - which is true since all games currently out run fine on DX9 cards. He's not condemning DX10 technology - he's simply saying to wait until they're actually needed to play games at their full potential (like Crysis).
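
To illustrate what "pain and suffering to the programming side" means in practice, here's a toy sketch of the first wall you hit with threads: shared state. It's my own example, in Python for brevity (nothing to do with id's actual code), but the problem is the same in any language:

import threading

score = 0                   # shared game state
lock = threading.Lock()     # needed: "score += 1" is not atomic across threads

def update_entities(hits):
    global score
    for _ in range(hits):
        with lock:          # every write to shared state must be guarded
            score += 1

threads = [threading.Thread(target=update_entities, args=(100000,))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(score)                # 200000 -- drop the lock and updates silently vanish

And that's the trivial case. A real engine has rendering, physics, AI and sound all contending for the same data every frame.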
 
I guess he hasn't seen the Crysis trailers. 75% of all games in development are DX10 titles. The only shite game that isn't is his Quake Wars.

Meh. The Dx9 Crysis trailers I've seen look just as good as the Dx10 ones.

The whole Dx10 thing is looking awfully similar to the Dx9 thing when the 9700 PRO was released... everyone was scrambling for Dx9 cards and 2 years later the Dx8 cards were still giving great service because the visual difference between Dx9 and Dx8 was almost undetectable from a player's standpoint.

Two years after the 9700 PRO came Shader Model 3.0 and OpenEXR... the first real tangible visual features those older cards couldn't display. And even those are hardly a 'must-have'; I mean, SM 2.0 cards are still (and I reckon will continue to be) viable for a while yet.

I'm still waiting to see a tangible difference between the Dx9 and Dx10 code paths before I jump on the 'must have Dx10' bandwagon...
 
Those wishing to take the plunge into DX10 will also have to do so while upgrading to Windows Vista. Carmack, however, isn’t all that excited about upgrading to the new OS: “We only have a couple of people running Vista at our company. It’s again, one of those things that there is no strong pull for us to go there. If anything, it’s going to be reluctantly like, ‘Well, a lot of the market is there, so we’ll move to Vista.’”

Carmack then said that he’s quite satisfied with Windows XP, going as far as to say that Microsoft is ‘artificially’ forcing gamers to move to Windows Vista for DX10. “Nothing is going to help a new game by going to a new operating system. There were some clear wins going from Windows 95 to Windows XP for games, but there really aren’t any for Vista. They’re artificially doing that by tying DX10 so close to it, which is really nothing about the OS ... They’re really grasping at straws for reasons to upgrade the operating system. I suspect I could run XP for a great many more years without having a problem with it,” he said.

Then on to the topic of multi-core gaming systems. Carmack has expressed his dislike for multi-cores, but with the two high-powered new generation consoles both making use of multiple cores, it may be something he just has to deal with. He says of the Xbox 360: “Microsoft has made some pretty nice tools that show you what you can make on the Xbox 360 [with the multi-cores] … but the fundamental problem is that it’s still hard to do. If you want to utilize all of that unused performance, it’s going to become more of a risk to you and bring pain and suffering to the programming side,” he laments. “So we’re dealing with it, but it’s an aspect of the landscape that obviously would have been better if we would have been able to get more gigahertz in a processor core. But life didn’t turn out like that, and we have to just take the best advantage with it.”

Yeah, I like my 6800GT but it's getting long in the tooth.
It seems Carmack is kinda stuck in the past.

He needs to evolve.
 
I have always said that quad core and DX10 won't be mainstream for at least another year, as dual core will be mainstream until then and then some.

DX10 and Vista will come out soon but probably won't be popular till late summer or after.
 
Well I'd have to disagree with you on that. He's not talking about multi-cores being bad technology; he's talking about the complications of making a game multi-core. His comment on gigahertz is a bit "noobish" but you get his drift.

Well if you love him so much why don't you just marry him? [/childish]
 
I guess he hasn't seen the Crysis trailers. 75% of all games in development are DX10 titles. The only shite game that isn't is his Quake Wars.

Meh. The Dx9 Crysis trailers I've seen look just as good as the Dx10 ones.

The whole Dx10 thing is looking awfully similar to the Dx9 thing when the 9700 PRO was released... everyone was scrambling for Dx9 cards and 2 years later the Dx8 cards were still giving great service because the visual difference between Dx9 and Dx8 was almost undetectable from a player's standpoint.

Two years after the 9700 PRO came Shader Model 3.0 and OpenEXR... the first real tangible visual features those older cards couldn't display. And even those are hardly a 'must-have'; I mean, SM 2.0 cards are still (and I reckon will continue to be) viable for a while yet.

I'm still waiting to see a tangible difference between the Dx9 and Dx10 code paths before I jump on the 'must have Dx10' bandwagon...

I agree about SM 2.0, as Crysis will be single-core and SM 2.0 capable.
 
Well, coding is learning another language. I can understand where he is coming from; I once took a class on C++ (although I didn't like it :? ).
It's like a foreign language that changes constantly according to new demands. But he does seem to be just ranting and whining for most of the article :evil: .
 
Lol. This is actually the first time I've heard of this guy. All I'm saying is that maybe you misinterpreted what he wrote.

His statements suggest he doesn't look towards improvement, upgrading, evolving.

Yes, current stuff will last, but in the computer/electronics world you'd better always be on the ball.

He seems to want to optimize current stuff, but that doesn't always work.
 
Carmack could give nerd lessons. I mean, look at his hairdo. WTF? Does he get laid? I can't see how. He's worse than Bill Gates.

john_carmack.jpg


That whole eurotrash flowing hairdo style died in the 80's.
 
He's probably gay. And he's probably the "girl" in his gay relationship.

PM to SEALboy: Where'd you put the 'General Failure' thing that was in your sig? BEST SIG EVER.
-cm
 
Well I'd have to disagree with you on that. He's not talking about multi-cores being bad technology; he's talking about the complications of making a game multi-core. His comment on gigahertz is a bit "noobish" but you get his drift.

Yes, the process of optimizing for dual cores is a burden on software developers, and it would have been much simpler if the GHz had just kept soaring: no complicated optimization needed for a 6GHz single core; it would run faster on its own.

What probably worries John further is having to optimize for quad core, octo core, hexadeca core... as they release and become mainstream (if that happens). I'm no guru on CPU architecture, but I think a shiteload of cores won't work forever; only so much can be done with parallelism. Atomic computing, now that will lead us to a technological singularity!
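
For what it's worth, that intuition has a name: Amdahl's law. If a fraction p of the work can run in parallel, n cores give at most a 1/((1 - p) + p/n) speedup. A quick Python sketch, with an invented 80% parallel fraction just to show the shape of the curve:

def amdahl(p, n):
    # maximum speedup on n cores when a fraction p of the work parallelizes
    return 1.0 / ((1.0 - p) + p / n)

for n in (2, 4, 8, 16):
    print(n, "cores ->", round(amdahl(0.8, n), 2), "x")  # 0.8 is a made-up figure
# prints 1.67x, 2.5x, 3.33x, 4.0x -- sixteen cores top out at 4x
# because the serial 20% caps everything, no matter how many cores you add

So a shiteload of cores really does stop paying off once the serial part of the code dominates.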
 
Carmack could give nerd lessons. I mean, look at his hairdo. WTF? Does he get laid? I can't see how. He's worse than Bill Gates.

john_carmack.jpg


That whole eurotrash flowing hairdo style died in the 80's.

Do I detect a *gasp* mullet? Can't quite tell from that angle.
 
I think you're being a little too harsh.

What he said was true at the moment, but hardly profound.

Disabling the second core on an E6300 and OC'ing it to 3.X GHz would give much better gaming performance at the moment than even a quad core at 1.8GHz. GHz is instant performance with no programming.

Of course a developer would sooner want that. It makes his life much easier; that, and consumers don't have to wait for software to make use of it either.
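
You can even put rough numbers on that with the usual back-of-envelope model (per-core speed scales with clock and the serial part of the game is stuck on one core; the 20% threaded share below is an invented figure):

def effective_speed(ghz, cores, p):
    # Amdahl-style model: the serial share (1 - p) runs on one core
    # at `ghz`, the parallel share p spreads across all the cores
    return ghz / ((1.0 - p) + p / cores)

p = 0.2  # assume 20% of a 2007-era game's work is threaded
print(effective_speed(3.0, 1, p))  # OC'd single core: 3.0
print(effective_speed(1.8, 4, p))  # stock 1.8GHz quad: ~2.12
# the quad only pulls ahead once p > ~0.53, i.e. more than half the
# game would have to be threaded before extra cores beat raw GHz

Which is exactly why "GHz is instant performance" held up at the time.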

Core 2 isn't the best chip because it's a dual core. That sure didn't help the Pentium D much, lol. It's better because it's a better design, and adding GHz to it makes it even faster.

Even now with everyone buying dual cores (including myself), gaming has yet to get better. Most people will only really benefit from the huge improvement in video encoding at the moment.

Intel has shown extra cores can do amazing stuff for games, like in Alan Wake. But now we get to hurry up and wait.

As for video cards, he's right too. You don't need an 8800GTX unless you game above 1600x1200, and by the time DX10 is mainstream in gaming the video cards will be cheaper and faster. (The spring refresh for nV and ATi's R600 are right around the corner already...)

9700 Pros were killer Dx8 cards at launch, but didn't do so hot once we all had Dx9 games. DooM3 (yes, it does suck) ran like crap on 9700 Pros when it finally came out.

Sure, John is not looking so hot these days as other companies release prettier/more advanced engines, but cut him some slack.

He's not too far off base.
 
Carmack could give nerd lessons. I mean, look at his hairdo. WTF? Does he get laid? I can't see how. He's worse than Bill Gates.

john_carmack.jpg


That whole eurotrash flowing hairdo style died in the 80's.

Do I detect a *gasp* mullet? Can't quite tell from that angle.

He looks exactly like my computer science teacher, except younger. Man, that is f***ed up. Do these guys meet on weekends to discuss style or something?!