OpenGL 3 & DirectX 11: The War Is Over

Do you really need a better-looking game?

Do us a favor and play your Pony SlayStation games. You ask why, when there are PSS2 and PSS3? Well, you don't really need better-looking games, do you?

And the entire graphics industry, which has been evolving 4x faster than the other hardware around it for the past 15 years, should think this too. They've done enough; it's no holodeck yet, but Crysis is close enough. We should just stop innovating new ways to display superior quality and go back to reading books and using our imagination.

Not that bad of a suggestion, really.

No way it's going to happen anyway.

I've had this question for a long time: how about an OpenGL fork?

What is that?
 
The incredible arrogance of Jobs has a lot to do with this. He even stated recently that he doesn't care about gaming. And since Apple is the only rich company that could have a vested interest in making OpenGL work better, they not only lose out on gaming but also on other 3D apps, like CAD and Maya.

No one is rich enough, or cares enough, to make Linux a real competitor. There is more of a chance that a socialist country (China) will do this than IBM.

Meanwhile MS invests billions to (try to) improve their products and maintain their dominance. So, as much as we wish they had added DX10 to XP, there isn't any incentive for them to do so. Goliath can do what he wants when David is spending all his time jerking off.
 
There are some rumors going around in MS OS development circles about Microsoft considering doing away with the Win32 kernel in Windows 8. MS is working on Windows 7 and also has some kind of pilot project looking at a new kernel for Windows 8. They are dreaming of a new MS kernel built from the ground up which is leaner and more object-oriented. I don't really know how this would deal with what they use for the hardware interface level, i.e. device drivers. Then there is DirectX; who knows what they are thinking.

Remember, dreams and plans for new development are often just empty promises. However, I would not be surprised if they are thinking about how to improve security before they get a new gaming interface working.
 
I think I read in this very article that both HD consoles use D3D-
They both use DirectX 9 level hardware. That's not the same thing as saying they both use DirectX. The PS3 does not.
and that would make so much more sense- have you seen PS3 games? They look good- no way OGL is driving that.
It is. And by your own admission the games look fine.
 
Until a couple of years ago I'd never much liked the PC as a
games platform, or as a work platform either (my main system is
an SGI) though I do now use a PC for video encoding. For gaming I
bought a PS2 and enjoyed the games a lot, especially the later
releases such as Black which had surprisingly nice visuals. I was
looking forward to the PS3, with titles like, "Resistance: Fall
of Man", GTA IV, etc. I like 1st-person shooters such as
Mercenaries, Call of Duty, etc., and highly explorative/flexible
'open' games such as "Drakan: The Ancients' Gates", Summoner 2,
the GTA series, etc. Plenty of games on the PS2 of this type and
I certainly reveled in never having to worry about OS nonsense,
driver bugs, virus hell, etc. Game crashes/freezes are very rare.

But in the end, once the PS3 launched, I was not impressed at the
high pricing (450 UKP in the UK, ie. $900) and the high cost of
the games (50 UKP was typical, ie. $100), so I didn't buy one.

I already had a XEON Dell Precision 650 I was using for general
work stuff (an AGP system) which didn't actually cost me anything
(bought two, sold one at a profit that covered both) so I decided
after reading lots of online articles on toms, anandtech, etc.
to replace the existing Quadro4 900XGL with a modern gfx card,
namely an X1950 Pro AGP (this was in Dec/2006). I was very
impressed with the results and was delighted at being able to
easily play exactly the kind of game I really like (I bought
Oblivion and Stalker). A few months later I sold the Dell and
used the money to more than cover the cost of a new set of base
parts, keeping the X1950 card and the disks (4 x 147GB 15K U320
SCSI). The mbd was cheap (Asrock AM2/DDR2-800 AGP, only 35 UKP),
the CPU purchase was perfectly timed (Athlon64 X2 6000+, 156 UKP,
the very week the price was halved by AMD so it was massively
cheaper than an equivalent Core2Duo), RAM was much cheaper by
then (I bought 4GB), got a nice case (Centurion Plus 534),
reasonable PSU, and the results were fantastic. The faster RAM
(DDR2/800 instead of DDR266) enabled framerates as much as 6X
better than with the same gfx card used in the Dell. I ended up
getting better results than online review sites were seeing with
the PCIe version of the X1950. 😀 Indeed, for an Athlon64 X2
system with an X1950 card, I had the no. 6 spot on 3DMark06 (5583).

Finally this year for my bday (so the cost to me was zero), I
switched mbds again to an ASUS M2N32 WS Pro (so I could have
proper PCIX and PCIe SCSI RAID) and replaced the gfx with
a GF8800GT PCIe (gf bought me CoD4 for xmas, still not yet used).
Now the games run with all detail settings maxed-out at 2048x1536
(using no AA but 16x AF). I'm really pleased with the results,
and I have CoD4 sitting unopened for when I feel I've done
sufficiently more of the games I already have.

Do I miss having a PS3? No. It would be nice to play GTA IV,
but I don't have time to cope with more than 2 games at any
one time and it'll be ages before I'm bored with Oblivion and
Stalker. Heck, I still have PS2 games I haven't finished, and
a couple I've not even started (my brother bought me several
way back).

In the past I was never interested in PCs as a gaming platform.
Friends I knew who did use PCs always seemed to be spending time
reinstalling Win98, fighting virus woes, constantly upgrading,
etc. But things have changed; XP is a good platform for gaming,
and by not always getting the very latest gfx/CPU every time
something new comes out, one can stay nicely up to date without
breaking the bank (I'd say upgrade once every 18 months at most,
aim for the 100 to 150 UKP price point for the gfx). In my case,
through a bit of luck, I've been able to maintain a good system
at no net cost at all, whereas if I'd bought a PS3 for 450 UKP
I'd be pretty peeved at the degree of devaluation and changes
in disk capacity, etc.

My 1st console was actually an N64 (I had the 1st ever web site
on the Ultra64 as it was called before launch), then bought a
PS2. But now, for all the arguments that rage about APIs,
multiplatform availability of certain games, etc., I find it hard
to ignore the simple cost difference: PC games cost half as much
as PS3 games, I'm able to run the games at mad levels of detail
(twice the vertical line res of many PS2 games), and messing
around with the whole hw overclocking business is certainly a lot
of fun (783MHz core with the GF8800GT). In that sense, the PC as
a gaming platform is actually more than just a gaming system: I'm
pretty sure the kind of people who like to play games on good
spec PCs at high detail are very much the same people who are
into overclocking, etc. It's a natural match. I'll probably buy a
PS3 at some point purely for its HD playback capability (far
cheaper than any dedicated player), but not until I have a decent
HD TV, and there's no point in buying an HD TV until the display
technology has matured (I'm waiting for OLED systems) and there
are plenty of HD channels worth watching with suitable content.

One other thing that persuades me of the continuing life of
the PC: the mod scene. I was so impressed at the plethora of
extensions available for Oblivion and Stalker. One could never
run out of more things to try, new maps, etc. But with the
consoles, the use of mods is very restricted and it's difficult
to sort out patch fixes for a particular game - the online
services for the consoles are just nothing like as open in how
they work as I would like and in some cases are not even free.
I know someone who bought the GoTY edition of Oblivion for his
PS3 and although he enjoyed it a lot, he kept finding quirks &
bugs which he was unable to fix because there was no way to
download patch fixes, in some cases quite bad ones (eg. an entire
side mission not available). In the end he sold the GoTY version
and replaced it with the standard edition which did not have
most of the various bugs. Indeed, it seems these days that many
console games are nowhere near as rigorously tested as they used
to be.


It's hard to say where the API 'war' is heading, the hardware
changes so fast. A single technical revolution could change
everything, eg. a GPU design using memristors and qubits that
has true volumetric effects which currently do not exist at
all in any genuinely native form (fire, water, mud, smoke, snow,
ice, rain, etc.) Something like that would permit entirely new
kinds of games to be written.

It's certainly a pity OGL has slid in its relevance to modern hw
but no surprise given the infighting between consumer companies
and those supplying technical markets. Car companies have a lot
of influence anyway, so no wonder issues around CAD applications
have thrown the odd spanner in the works. SGI's legacy came out
of supplying visual simulation markets, which explains much of
OGL's early design (see my SGI site for details).

But all this makes me wonder about a key aspect of consoles
that is definitely a good thing: they're far more efficient
at making the best use of their gfx hardware than PCs.
Given the generation of GPU in the PS3, the realism in the
latest games is very impressive. On paper, the latest PC gfx
cards should be _massively_ faster, but why aren't the latest
PC games equivalently a lot more visually realistic?

Much is made of Crysis, used everywhere as a benchmark because
it taxes gfx cards so severely. Yet how do we know the fps
results with Crysis aren't lower purely because it's not written
very efficiently? Who has ever measured just what Crysis is
doing in terms of triangles/second, how much it's getting out
of the cards? I did try and ask someone at NVIDIA I used to know
about this (Ujesh Desai), but he didn't answer, too busy I guess
(when I was doing stuff for SGI he was the guy looking after the
O2 system I borrowed). Still, I do wish some site would look into
whether games are using gfx hw efficiently. A senior 3D developer
at Virtual Presence once told me that few PC apps ever get more
than about a third of peak performance out of any PC gfx card
before the technology has moved on to the next product. There's
never enough time to evolve the drivers to a decent degree. By
contrast, consoles traditionally end up squeezing every ounce of
possible speed out of the hw that can be obtained, though MS
seems to be breaking this somewhat by turning around products
much faster than has been the case in the past, ie. Xbox360 not
really out that long before the next MS console is released.

No doubt gfx cards will get ever faster, but to me it feels like
game writers are getting lazy, or at least I suspect so. And
meanwhile, for all that games continue to look ever nicer with
more visual effects, I still so strongly wish for a revolution
in _how_ games work rather than just visual realism, eg. properly
modifiable environments in the way Red Faction tried to do (a
grenade should blow up stuff no matter where one is, not just
where the game world 'allows' things to be damaged), a more
consistent continuous game world rather than simple 'spheres of
existence', more fluid AIs instead of the typical "every entity
is either friend or foe", and so on. Crysis has done some things
in some of these areas, but is still very lacking. My fear is
that so much focus is on 3D speed and realism, we may end up with
a gaming slump when people decide that the game play itself is
not that good even though the game looks incredible. I even heard
a comment recently on a TV gaming show, the presenter said about
the latest racer, "Yeah, yeah, we're all bored to death with
ultra-realistic driving sims. What's new??" Such an irony given
that only a couple of years earlier, all the focus for driving
games was on the ever improving realism.

See this old article for a good discussion of these issues:

http://www.sgidepot.co.uk/reflections.txt


At the end of the day though, all this effort is designed to make
money out of us consumers, so don't expect every change to be in
our interests. But as long as I can play games like Oblivion and
Stalker, I'll be happy.

Sorry for the long rant, just hope some of you find it interesting.

Ian.

PS. Enhanced gaming experience: watch Band of Brothers, *then*
play Call of Duty. Hot damn.... 8)

---

mapesdhs@yahoo.com

 
This article has a lot of truth, but is so........ Microsoft... it looks like it was made by a Microsoft fanboy. After reading it all I don't know if what I've read is true; it reads like it's trying to make Windows look awesome when it is not, and there is wrong information in it: the PS3 doesn't use DirectX at all... it uses OpenGL.

I feel bad about this because I had always relied on this page, but it looks like it's trying to make Microsoft and Windows look better when everybody knows they are falling, trying to force people to switch to Vista to play on DirectX 10. OpenGL is not as bad as this article makes it look; just compare good Xbox/PC games with good PS3 games. The war is not over: DirectX 11 is not here yet, and we haven't seen any game using OpenGL 3 technology...
 
I don't know much about APIs (I'm studying them at the moment, so I'd love to know more, lol), but I DO know that when you don't have a set of instructions to do what you want, you work around them. My point is that the old OGL API calls that CAD software relies on can be rendered on the CPU instead. New CPUs keep making leaps forward and get faster and faster (to some degree); so if the hardware can't run an old OGL instruction, the implementation can grab it and render it in software, right? Old 3D stuff (games and CAD) can run perfectly on software renderers, since modern hardware HAS the capability of rendering it without a big performance penalty. Kinda like "emulating" PSX or gen 4-6 consoles on PCs.
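For what it's worth, an app can at least detect when it has landed on a CPU renderer. A throwaway C sketch (just a heuristic based on the renderer strings I know of, nothing official):

[code]
#include <string.h>
#include <GL/gl.h>

/* Call this with a GL context current. Windows' built-in fallback
 * announces itself as "GDI Generic" (software-only GL 1.1), and
 * Mesa's software path reports "Software Rasterizer"; the exact
 * strings vary by version, so treat this as a hint, not a fact. */
int using_software_gl(void)
{
    const char *renderer = (const char *)glGetString(GL_RENDERER);
    if (!renderer)
        return 0; /* no current context */
    return strstr(renderer, "GDI Generic") != NULL ||
           strstr(renderer, "Software") != NULL;
}
[/code]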

We all benefit from re-thinking the things we do from time to time, and it sometimes makes us better, though we can fail. Redoing the OGL API (which won't be redone completely) will bring more benefits than drawbacks IMO. I hope those big CAD players stand up and realize that.

Esop!
 
Aren't there any groups interested in defining an open alternative API to OpenGL? Groups that might actually have great ideas for designing a new API? Perhaps they could pool past resources, and get AMD's, Intel's and NVidia's backing, as well as the gaming industry's.
 
[citation][nom]DXrick[/nom]"The war is over"? As long as there are other platforms (like Mac, Linux, and consoles) the war will not be over.[/citation]
I think it depends on which war you're talking about. I think the author is referring to the war for developer mindshare. That, and hardware developer buy-in as well: if no one develops software for it, it'll just be yet another useless API. As for hardware support, while I support the idea of open-source drivers for graphics cards, the hard fact remains that I get the best driver quality from the combination of NVidia hardware and NVidia proprietary drivers. Perhaps one day, when I have a problem that can only be solved by looking into the driver code and can't do anything about it, I'll complain. But so far, the NVidia devs have done a good job of keeping their code solid, even when kernel devs change the playing field underneath 'em.
[citation]It is a shame that so much effort at updating OpenGL has resulted in so few improvements. Microsoft spent a lot of moola in the creation of DX10, including the creation of the DX API and working with graphics card makers to standardize the features and requirements of the video cards. Why can't they just exploit this effort to update OpenGL to the same standards and features, instead of trying to reinvent the wheel?[/citation]
I think that's what the author meant when he said the ARB was trying to bring OpenGL 2 up to feature parity with DX10. Unfortunately, the DX10 spec itself is proprietary.

It seems that the CAD software developers have a lot of clout where Khronos's efforts are concerned, since they were able to stifle innovation. With a group protecting its interests like that, I asked in a previous post whether any non-partisan groups might be interested in starting a spec.
 
It is. And by your own admission the games look fine.

They sure do. Quake Wars looks very fine too.

Honestly, I couldn't tell it's OGL. Just by looking at the pictures I could have sworn it's D3D, but it obviously isn't.

Why OGL is considered obsolete, and why it is slower in professional 3D programs that support both OGL and D3D, I'm not sure.

However, many knowledgeable people seem to agree that OGL is in the underdog position. I don't know how well Quake or PS3 games would run if they were D3D; I can only judge from what I know:

In 3ds Max, D3D is 10x faster at worst. It's 100x faster on average, and that is on a professional card that does NOT cripple OGL in its drivers. (On gaming cards the difference is a factor of over 1000, simply due to OGL crippling in the drivers.)

/

What I'm getting more interested in is an OGL fork, if I understood the term correctly.

Why can't there be an OGL 'boring n old' version for yer standard old n boring peeps using archaic software- car manufacturers and such.

And another OGL that is built to be modern, 'lean and mean' as they've said. 90% of users would benefit from this, and the 'boring n old' version could still be there; it doesn't even need to be updated, since that user base wants it old n boring anyway. This way, compatibility is restored, and OGL 'lean and mean' doesn't even need to have backwards compatibility or anything else in common with the 'boring n old' version.

Where is the problem with this solution? I think it covers all parties and everyone is happy- obviously, you'd name the 'boring n old' version 'classic' or 'golden standard', so that old n boring peeps don't find it offensive- voila!
 
I have quite high hopes for Windows 8; at least I hope Microsoft will make it a pure 64-bit platform. (It'd be at least six years down the road due to the current life cycle of the Windows platform.)

This article was quite the interesting read. It's been a while since I heard anything about OpenGL, and this explains why nicely. All that remains now is for people to stop slamming DX10 and Vista, especially since Windows 7 will build upon the Vista platform.
 
[citation][nom]TormDK[/nom]I have quite high hopes for Windows 8; at least I hope Microsoft will make it a pure 64-bit platform. (It'd be at least six years down the road due to the current life cycle of the Windows platform.) This article was quite the interesting read. It's been a while since I heard anything about OpenGL, and this explains why nicely. All that remains now is for people to stop slamming DX10 and Vista, especially since Windows 7 will build upon the Vista platform.[/citation]
I hear it will still come in a "32-bit" flavor. God forbid none of the programs are compatible; we're gonna make a new fuck fest with 32-bit (even though by then no processors or computers will come with less than 4GB of RAM, and they'll all be 64-bit capable, as they have been since the last Pentiums and Athlons... I can get that in a 200 dollar PC).
By the way, it's Windows 7.
 
1) First of all - D3D is Windows-only (even worse - D3D10 is Vista-only), while OpenGL is available everywhere possible. Shader Model 4 features on XP are only possible via OpenGL (3.0 now), not D3D!

2) Some developers IMO prefer Direct3D because of DirectX: with DX you have an all-in-one kit for writing a game, while OpenGL is a pure 3D API and everything else has to be done separately. So game studios prefer DX as an all-in-one system covering everything they need to make a game.

3) OpenGL is more general-purpose than D3D, and D3D is more game-development-oriented. Everywhere outside the gaming business, OpenGL is still the preferred API, even on Windows. Also, D3D is developed more "closely" with Windows, so it should run a bit better on Windows than OpenGL implementations do, and if it didn't, that would be disappointing. In short, if D3D were available on platforms other than Windows, it wouldn't be as performance-efficient as it is on Windows. MS works very closely with AMD/nVidia to get the best possible results on its own platform - Windows - while OpenGL implementations will obviously always be less optimized for Windows. P.S. It is known that MS used some dirty tricks to try to kill OpenGL back in the distant past (and fortunately it failed), so I'm quite sure it is still doing the same, just more covertly each time. The only one who would be glad at OpenGL's death is MS, and MS makes Windows, so can we really expect OGL to beat D3D in performance on Windows? This is the sad reality... But anyway: some people say that the D3D versions of Max and Maya run 10x faster than the OpenGL versions - that's either the fault of MS (hidden dirty war again) or of bad coders of that software. D3D can't be 10 times faster than OGL by any means :)

4) As I already mentioned, OpenGL has to care about many more segments of realtime 3D than Direct3D does, and that was one of the explanations for the delay of its new version. But now the new profile feature should work around this, so game developers will get their own subset of the API. Also, I personally (like the majority of programmers, as I see it) am in favor of dropping support for old features. The thing I hate most is backward compatibility, which causes delays for people who want to move forward. BTW, MS is known for its backward- and stupidward-compatibility, which becomes a pain for not-so-conservative and not-so-stupid users. Vista and DX10 were rare exceptions in terms of backward compatibility, but the stupidward compatibility is preserved (many parts of the Vista UI are primarily "designed" for stupid (not novice but stupid) people, I think; BTW I'm a Vista user).

5) For your information: the PlayStation 3 uses a modified OpenGL as far as I know, and that's definitely not Direct3D :) And I have heard from people who have played both the Xbox 360 and the PS3 that the latter far outperforms the former :)

6) At first the author says that OGL 2.1 was too outdated; then he says the new version is nothing but a small update and should be named 2.2 rather than 3.0; he also says that DX10 is a big step forward from DX9; and finally he says that OGL 3.0 "barely keeps up" with D3D10 (but it keeps up after all, and with D3D 10.1 too, as I understood it). So how are we to make sense of all this together?

7) Yes, the new version of OGL was a bit late (maybe very late); yes, many of the expectations turned out false and we didn't get what we hoped for; yes, after version 10 Direct3D was a bit ahead of its rival. But I believe all this happened because Khronos needed some time to fully grasp and take control of the situation. So what do we have now? OpenGL and Direct3D are nearly at the same level at this precise moment, and it doesn't matter what happened in the past, does it? OpenGL gives developers everything D3D does (and who cares whether it's in core or via extensions?). There is NO DX11 YET, so until it becomes available, how can we know what the OpenGL ARB Working Group of Khronos will prepare for us? MS loves announcing and popularizing things it has not yet done, so by the time DX11 becomes available I'm quite sure OGL will be updated at least once to meet the new D3D levels. I hope Khronos is finally in charge of the situation now and there will be no more delays!

8) So what's the future? One thing is for sure: the D3D vs OpenGL battle is MS versus everyone else, and it's very hard to imagine MS beating everyone else, isn't it? It's more predictable that someday the monopoly of Microsoft will collapse, and so will all its technologies (sadly for all MS-dependent developers, including me). Almost everything MS does is of low quality, with very few exceptions, maybe including .NET and the latest DirectX, so I don't expect much from that company. (BTW I'm a user of, and developer for, the MS platform, and I have no hate for it like some Mac and Linux fans do, so I know what I'm talking about; just personal experience.)

Maybe my post is a bit anti-MS in style and the author's article is a bit Microsoft-ish, so the truth is probably somewhere in between :) But one thing is certain: this is definitely not the end of the war; the war has just begun...

Georgian.
 
[citation][nom]jaragon13[/nom][/citation]

This I know, but I'm talking about the Windows version after Windows 7 (hence why I say it'll likely not be for another 6-7 years).
 
[citation][nom]eodeo[/nom][/citation]

As far as I know, the majority of OS X 10.5's kernel is 32-bit. The Cocoa API is 64-bit. Chess.app and Xcode.app, I believe, have x86_64 binaries. I think I read somewhere that Apple is shooting for 10.6 to be "true" 64-bit (i.e., 64-bit kernel and drivers).

The only reason *nix OSes don't see the majority of games is that D3D is the de facto API in the game development community. D3D is entirely closed and locked down. End of story.

Until D3D is moved off the Windows platform, or a more competitive cross-platform API comes along, *nix OSes won't see the big AAA titles. This is why Steve Jobs doesn't care about gaming.

They should just rewrite OpenGL for 4.0, drop legacy support, and modernize the API.
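GL 3.0's new context-creation path already has the hook for this: you can ask the driver for a "forward-compatible" context with the deprecated features removed. A minimal C sketch (Windows/WGL; it assumes a legacy GL context is already current, since wglCreateContextAttribsARB can only be fetched at runtime, and the helper name is mine):

[code]
#include <windows.h>
#include <GL/gl.h>

/* Tokens from the WGL_ARB_create_context extension spec. */
#define WGL_CONTEXT_MAJOR_VERSION_ARB          0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB          0x2092
#define WGL_CONTEXT_FLAGS_ARB                  0x2094
#define WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB 0x0002

typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)
    (HDC hdc, HGLRC share, const int *attribs);

HGLRC create_lean_gl3_context(HDC hdc)
{
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)
            wglGetProcAddress("wglCreateContextAttribsARB");
    if (!wglCreateContextAttribsARB)
        return NULL; /* driver predates GL3-style context creation */

    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        /* Forward-compatible: deprecated fixed-function entry
         * points are absent from the resulting context. */
        WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
        0
    };
    return wglCreateContextAttribsARB(hdc, NULL, attribs);
}
[/code]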
 
I'm still happily using DX9 on WinXP. I know I'll eventually have to move, but I won't give it up easily. I simply don't want games that cost >$100 and require an internet connection. I do a lot of my gaming when I commute, and I don't yet have wireless broadband (nor am I likely to fork out $1000-2000 over 24 months to get it just so I can play a few of the newer games).

Microsoft and the PC developers have lost the plot. THAT is why PC gaming is dying. It has nothing to do with competition from consoles.
 
Good article with lots of detail, but the opening really could have used an "executive summary" style conclusion so we didn't have to wade through the 8 pages of history to know what the author's positions or conclusions were.
 
As far as I know, the majority of OS X 10.5's kernel is 32-bit. The Cocoa API is 64-bit. Chess.app and Xcode.app, I believe, have x86_64 binaries. I think I read somewhere that Apple is shooting for 10.6 to be "true" 64-bit (i.e., 64-bit kernel and drivers).

Yep. Snow Leopard is supposed to be an actual 64-bit OS. Hopefully other Mac applications (Adobe, I mean you!) will move to 64-bit then. If 64-bit remains limited to Cocoa (whatever that is) and Chess, it would be a waste, since I doubt either needs more than 2GB of RAM.

They should just rewrite OpenGL for 4.0, drop legacy support, and modernize the API.

agreed
 
At the moment Windows is the dominant PC gaming platform; old news to everyone. But I see a trend of new games being released for both Mac OS X and Windows, for example: Spore, StarCraft 2 and Diablo 3.

I also do not see this trend going away soon. And to my knowledge Mac OS X only supports OpenGL, and not any of the Microsoft APIs?

And seeing that Mac OS X is a Unix-like operating system, does this make it easier to port games to other Unix-like operating systems such as GNU/Linux in the near future?

Regards,
 


I disagree with you. Although I do not use Linux very much, I do think there is an American company that will take Linux public and make it work. Who, you may ask? It's not too hard to figure out. Google. They are coming out with their OS... showing it to the world tomorrow (the 23rd) with T-Mobile. A Linux-based Google OS on a mobile platform. They denied working on this earlier this year... but also denied working on a computer OS as well. What's in the pipeline? Will Google come out with their own breed of OS? I think so. Maybe Microsoft knows this, and maybe that's why they are shaken by Google, shaken by a search engine.
 


The real question is not when they are moving to just 64-bit, but to 128. The new processors starting to come out in the near future will be 128-bit capable. I am not sure about Nehalem, but I know Sandy Bridge is moving toward a 256-bit data path.

...The data path is widened from 128 bits to 256 bits...

Maybe I am mistaken about this, but that is what it sounds like to me. If so, please correct me.

I also hear that W8 will be 128-bit.


Evolution moves faster the more we evolve.
 
But I see a trend of new games being released for both Mac OS X and Windows, for example: Spore, StarCraft 2 and Diablo 3.

I don't know about Spore and its motives, but both SC2 and D3 are Blizzard games. Blizzard has ALWAYS shipped both PC and Mac versions of its games at launch. They're very committed to both operating systems.

What I also know is that all their 3D games default to D3D under Windows. I always assumed this is due to superior speed. I'd like to see a benchmark on the same hardware comparing how the game runs under Mac/OGL and Win/D3D.

Like I said, pro programs run D3D 10x faster than OGL at the worst; on average OGL is 100x slower. If you don't believe me, download one of the many free 30-day trials and run some benchmarks. Be forewarned that if you don't own a professional graphics card, OGL will be over 1000x slower than D3D, simply due to driver crippling. That difference is artificial; the realistic gap is much closer to 100x.


Will Google come out with their own breed of OS? I think so.

The very first thing Google would need is:
1) a better OpenGL, or
2) D3D support, or
3) a new 3D API that is on par with D3D.

If they don't do any of the three, I don't see a reason for anyone to use their OS over a Linux distro like Ubuntu.

If they do one of the three, but not #2, that would be the best overall scenario, although harder at the start. And that's before even mentioning software/driver compatibility support.

The real question is not when they are moving to just 64-bit, but to 128. The new processors starting to come out in the near future will be 128-bit capable. I am not sure about Nehalem, but I know Sandy Bridge is moving toward a 256-bit data path.

Seeing how 64-bit addressing allows 16 EiB (over 18 exabytes) of RAM to be addressed, I don't see a reason for 128-bit computers. Then again, I know so little about this field; maybe 128-bit introduces something other than a larger addressable memory.
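To put a number on it, a quick back-of-the-envelope check (a throwaway C snippet, nothing more):

[code]
#include <stdio.h>

int main(void)
{
    /* Full 64-bit address space: 2^64 bytes. */
    double bytes = 18446744073709551616.0;    /* = 2^64 */
    printf("%.1f exabytes (decimal), %.0f EiB (binary)\n",
           bytes / 1e18,                      /* 1 EB  = 10^18 bytes */
           bytes / (double)(1ULL << 60));     /* 1 EiB = 2^60 bytes  */
    return 0;                                 /* prints 18.4 and 16  */
}
[/code]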
 


It's Google; they would do something smart. Seeing the things they have come out with, I would assume that they would not only have D3D and OGL support, but also be able to run both .exe files and whatever Mac executables are called. Although it may be hard, they are the only people I think would do anything like this: make a system that is truly universal. Not only that, they would give you what you need. No more, no less. Take Chrome as an example. Everything you need is there (they have a great menu system), but it is the cleanest-looking browser I've seen. They would also cater to both the enthusiast and the professional, giving speed through a minimalist OS. I have no proof of this; I am just going off my ideas, my dreams, my hopes. I want to see a new OS, a good OS. I think Google is the only one capable of delivering. Hell, they have Microsoft worried, and for good reason. Chrome (an OS on a stick; if you aren't sure, check it out, it has its own freaking task manager in it!) and soon Android. Their "first" OS.

Wow... do I own stock in Google? No.... I wish I did. I am just a slight fanboy for them! 😀




Maybe... Who knows why they would. I guess we can ask that about home computers and quad cores too, with hex and octo cores soon to come (physical; double that for logical).
 
Quite a few comments here about what people think OSX has in
terms of 32bit vs. 64bit, and what Snow Leopard will have,
but little fact or certainty. Please read the following to
learn the correct info:

http://www.appleinsider.com/articles/08/08/26/road_to_mac_os_x_10_6_snow_leopard_64_bits.html

Far as I can tell, Snow Leopard is much more about vast improvements
in how the system makes use of multiple cores, threading, etc.,
something MS is pretty much not bothering with at all.

As always with these issues, it's not as simple as just 32bit vs.
64. Numerous parts of the internal hardware may have different
widths, and sometimes paths may be narrower or wider than 64bit
for other reasons, eg. narrower bus but at a faster speed, or
a wider bus at a slower speed. SGI's Challenge/Onyx design was a
fully 64bit system, but it had a 256bit data bus at a lowish
37MHz (easier to implement back then in the early 1990s) giving
1.2GB/sec, while the address bus was 40bit (more than enough for
the max 16GB RAM). It got round bandwidth limitations of having
to service as many as 36 CPUs by including features such as
shared RAM access requests (multiple CPUs ask for the same data,
all requests trapped and only one main request sent off, data
received, multiple copies sent back to all CPUs that requested
it). Making buses ever wider is not efficient though, thus the
move to scalable architectures like Origin which don't need very
wide buses to achieve high bandwidth (not a bus-based system
in fact).

So, the interconnect widths used inside Macs or other systems
are kinda irrelevant as to whether a system is 64bit or not. As
an earlier poster said, the addressable RAM space of a 64bit
register is still very large; no need for anything beyond this
yet. Whether a system can in some way process multiple 64bit
data items in parallel is a very different issue, one which some
might call a design beyond 64bit (same kind of marketing spin
used by Nintendo for the N64). Though never implemented, SGI's
MDMX design for media processing involved a 192bit accumulator
register, but that would not have meant a MIPS-V system using
it would have been a 192bit system. 😀

An awful lot of twaddle is written about 32 vs. 64 bit. Many
forget that some apps as written will run slower when rebuilt
as 64bit (pointers, and thus pointer-heavy data, are 2X larger),
unless hardware changes mitigate the higher bandwidth requirement.
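A trivial C illustration of the effect (sizes assume typical
ILP32 vs. LP64 compilers; the struct is just an example):

[code]
#include <stdio.h>

/* A typical pointer-heavy node: built 32bit (ILP32) this is
 * 12 bytes; built 64bit (LP64) the two pointers double to 8
 * bytes each and padding rounds the struct up to 24 bytes, so
 * walking the same list touches twice the memory for the same
 * amount of actual work. */
struct node {
    struct node *next;
    struct node *prev;
    int          value;
};

int main(void)
{
    printf("sizeof(struct node) = %zu bytes\n", sizeof(struct node));
    return 0;
}
[/code]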

Ian.

 