[citation][nom]cablechewer[/nom]BTW, some other things that hurt OS/2 were (all of these are subjective and based on my memory):
-version 2.0 was buggy. I remember it was rushed out the door to try and combat Windows 3.0. The price was that, for a while, OS/2 earned a poor reputation. Everything was fixed with 2.1, but by then the ship had sailed and IBM was still on the dock.
-I had several people comment that Windows 3.0 and 3.1 were prettier and easier than my OS/2 desktop. As long as you didn't have any problems that may have been true, but in my experience OS/2 was easier to work with.
-Once Microsoft had the installed base they got all the attention from the driver and hardware manufacturers. IBM was focused on business, but Microsoft had the consumer market sewn up. For a vendor looking to release a new sound or video card this made OS/2 a low priority and helped force OS/2 into a downward spiral where people wouldn't buy an OS with poor hardware support and vendors were slow to support it because the installed base was too small. I think this is part of why linux doesn't take over now.
-of course the DOJ case has other reasons for how this went, but that is a whole other can of worms that I won't come any closer to touching.
Oh well, water under the bridge now though I really miss the object based GUI. Microsoft has yet to design an interface I like half as much as the old one from OS/2.[/citation]
2.0 was buggy, but it wasn't 2.11 that fixed it; it was CSD4064, the service pack for 2.0. By the time that shipped it was past the point where it mattered, and the bugs really had little to do with OS/2's fate anyway. OS/2 2.0 was still better than Windows, a lot better. And it wasn't rushing that created the problems; the release was actually late. The decision to make the Workplace Shell part of the operating system added a lot of complexity. I agree, though, and in fact I voiced that opinion before it was released: 2.0 was too buggy. But even if it had been perfect, it would have changed nothing.
Device drivers were hardly an issue. IBM had a huge market share and supplied drivers for its own products, and at that time you emulated IBM hardware anyway: if you wanted to create a video card, it had better be 8514/A or XGA compatible. Drivers were not the problem.
Neither was the consumer market, which was very small indeed in 1992. Windows was popular in the business market even then; the home market barely registered.
The remarks about Windows make no sense to me. OS/2 1.x looked just like Windows, and OS/2 2.x and later had Windows inside of it, so you could get exactly the same appearance by bringing up Win-OS/2. Windows looked primitive and was underpowered by comparison, but I suppose some people liked it because they were familiar with it. That's easy to understand. Still, it was right there: unless you were running OS/2 apps, you could work entirely in your Windows environment inside OS/2.
Linux is Unix, and Unix will never be mainstream. I've said this for 25 years, and I keep hearing from the fringes how "soon" it's going to take over. It won't; it's hostile, ugly, and poorly conceived. They try to hide it under an interface, which does help, but Unix is ugly underneath. If you think grep is intuitive, or switches whose meanings change completely between uppercase and lowercase (see the example below), then Unix is for you. It has always been less reliable than real operating systems like MVS, and less user-friendly than operating systems developed after it, like OS/2 and Windows NT. That's the difference: Unix is an old operating system, whereas Windows NT got to see what was good and bad about Unix and had the luxury of making decisions with more knowledge. Unix was never intended to be mainstream; Windows NT was.
But believe me, long after you and I are dead, someone will still be saying that Unix is about to come to life. It's a mantra that never dies, despite never coming true. Then again, anything is possible. Would anyone have predicted the 8086 would be so important? Even Intel considered the iAPX432 the more important processor, and Motorola's 68K was much more powerful and elegant. Yet despite being ugly and inefficient, something we all pay for today in lost performance and extra power use, x86 is still alive. So I could EASILY be wrong about Unix, but after so many years I see no movement toward it, whereas x86 became important much more quickly.
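To make the switch-casing point concrete, here is a minimal illustration with GNU grep (from memory; exact behavior varies by implementation and version), where the uppercase and lowercase forms of the same letter do completely unrelated things:

grep -c needle file.txt        (-c: print a COUNT of matching lines)
grep -C 2 needle file.txt      (-C: print 2 lines of CONTEXT around each match)
grep -v needle file.txt        (-v: print only the lines that do NOT match)
grep -V                        (-V: print grep's VERSION and exit)
grep -f patterns.txt file.txt  (-f: read the patterns from a FILE)
grep -F 'a.b' file.txt         (-F: treat the pattern as a FIXED string, not a regex)

Nothing about the letters tells you which is which; you memorize it or you read the man page. That's the Unix idea of user-friendliness.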
Clearly it has been somewhat successful for servers, and for scientific work, but the mainstream market has so far never embraced Unix. Apple is doing the best job, by isolating Unix's ugly internals from the user, which is Apple's way. Maybe if their hardware prices weren't so damning they'd have a chance, but Apple is always going to be a fringe player because that's where they want to be. From the Apple II on, they've overpriced their machines and never wanted to build an inexpensive computer to gain a lot of market share. So I don't think we can expect them to push Unix to the mainstream, even though the OS is not at fault. Their hardware is just too expensive for what it is.