Ultimate in over-the-top cell speculation. Intel manufactu..

Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

Greetings!

http://hardware.itmanagersjournal.com/hardware/05/03/03/0226235.shtml?tid=78

The article quotes at length one Jim Trounson, who is part of a group
that is developing a PCI-X card for Cell, or so they say.

Best science fiction of 2005 already awarded?

<quote>


Cell Industries predicts that Intel will be building Cell with
cooperation from IBM within a year.

Cell, software, and Microsoft's demise

For the anticipated finale, and the end of Microsoft dominance as we
know it, Trounson forecast that IBM will not give Microsoft hardware
to work with, and will cash in on its support for open source and
Linux.

<snip>

Cell Industries forecasts that as Intel begins producing Cell chips,
Microsoft will try to port its operating system to the new processor.
However, Linux will have a significant head start and Microsoft will
in turn "fall apart."

"When hardware is commercially available, Windows will take two to
three years to get the first version going," Trounson said. "IBM
already has Linux running on the Cell [at that point]."

Adding that Cell chips will be in short supply for years, Trounson
acknowledged that the prediction represents the unprecedented.

"The world has never seen a step change in technology like what is
about to occur," Trounson said.

</quote>

...and then I woke up.

RM
 

Robert Myers wrote:
> Greetings!
>
> http://hardware.itmanagersjournal.com/hardware/05/03/03/0226235.shtml?tid=78
>
> The article quotes at length one Jim Trounson, who is part of a group
> that is developing a PCI-X card for Cell, or so they say.

Crackpots can come from all industries. :)

> Cell Industries predicts that Intel will be building Cell with
> cooperation from IBM within a year.

He would've been more believable if he had said AMD was going to start
building Cell, since after all AMD and IBM have been synchronizing their
process technologies recently. So has Chartered.

> Cell Industries forecasts that as Intel begins producing Cell chips,
> Microsoft will try to port its operating system to the new processor.
> However, Linux will have a significant head start and Microsoft will
> in turn "fall apart."

Sort of like how Microsoft fell apart after falling two years behind
Linux in the x86-64 arena, I guess?

> Adding that Cell chips will be in short supply for years, Trounson
> acknowledged that the prediction represents the unprecedented.

I see he's already got his fallback ready in case his predictions inevitably
don't come true: Cell chips were in short supply; that's why it didn't
take off.

> "The world has never seen a step change in technology like what is
> about to occur," Trounson said.

Not since, ... oh Itanium, and then later Transmeta.

Yousuf Khan
 

On Fri, 25 Mar 2005 10:28:46 -0500, Robert Myers <rmyers1400@comcast.net>
wrote:

>Greetings!
>
>http://hardware.itmanagersjournal.com/hardware/05/03/03/0226235.shtml?tid=78
>
>The article quotes at length one Jim Trounson, who is part of a group
>that is developing a PCI-X card for Cell, or so they say.

Uhh, does he mean a PCI-E card? Why the hell would anybody be interested
in a PCI-X card for a future system? It would be, err, good to get that bit
right before proceeding further.

>Best science fiction of 2005 already awarded?
>
><quote>
>
>
>Cell Industries predicts that Intel will be building Cell with
>cooperation from IBM within a year.

... and pigs will fly! I gotta see this one.

>Cell, software, and Microsoft's demise
>
>For the anticipated finale, and the end of Microsoft dominance as we
>know it, Trounson forecast that IBM will not give Microsoft hardware
>to work with, and will cash in on its support for open source and
>Linux.

B-b-b-but his *own* model is founded on open hardware specs. How could
anybody stop M$ from getting their hands on it?

>Cell Industries forecasts that as Intel begins producing Cell chips,
>Microsoft will try to port its operating system to the new processor.
>However, Linux will have a significant head start and Microsoft will
>in turn "fall apart."
>
>"When hardware is commercially available, Windows will take two to
>three years to get the first version going," Trounson said. "IBM
>already has Linux running on the Cell [at that point]."
>
>Adding that Cell chips will be in short supply for years, Trounson
>acknowledged that the prediction represents the unprecedented.
>
>"The world has never seen a step change in technology like what is
>about to occur," Trounson said.
>
></quote>

One "little" flaw I see - there is talk of:

IBM, on the other hand, will "recruit an army of developers" during
the first year of Cell production by supplying software development systems
-- as many as 100,000 -- to major application developers and large
companies, as Trounson told ITMJ.

Who is going to pay for the hardware and software for development? IBM has
not been good at giving anything away, even to developers and certainly not
speculatively. That was the main reason for the failure of OS/2. I've
also mentioned in the past that we, and others, coughed up $$ for Risc/6K
and Alpha... all for nothing... money down the drain - we won't do that
again. OTOH I have to confess I do not understand the open source business
"model".<shrug>

Off-hand, other things: 1) I don't see the XDR memory sub-system being
amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
CPU is kinda slim... without reworking the memory interface to get to 8GB
per CPU; 2) 32-bit FPU is not going to fly as a general purpose computer.

>...and then I woke up.

I hope this guy has a spare grungy garage for his efforts - seems like that
is part of the template for success he is aiming to emulate... cf. Dell,
Apple, et al. :)

--
Rgds, George Macdonald
 

On Fri, 25 Mar 2005 19:00:55 -0500, George Macdonald
<fammacd=!SPAM^nothanks@tellurian.com> wrote:

>On Fri, 25 Mar 2005 10:28:46 -0500, Robert Myers <rmyers1400@comcast.net>
>wrote:
>
>>Greetings!
>>
>>http://hardware.itmanagersjournal.com/hardware/05/03/03/0226235.shtml?tid=78
>>
>>The article quotes at length one Jim Trounson, who is part of a group
>>that is developing a PCI-X card for Cell, or so they say.
>
>Uhh, does he mean a PCI-E card? Why the hell would anybody be interested
>in a PCI-X card for a future system? It would be err, good to get that bit
>right before proceeding further.
>
Especially since such a card is almost certainly going to be
I/O-bound.

>>Best science fiction of 2005 already awarded?
>>
>><quote>
>>
>>
>>Cell Industries predicts that Intel will be building Cell with
>>cooperation from IBM within a year.
>
>... and pigs will fly! I gotta see this one.
>
>>Cell, software, and Microsoft's demise
>>
>>For the anticipated finale, and the end of Microsoft dominance as we
>>know it, Trounson forecast that IBM will not give Microsoft hardware
>>to work with, and will cash in on its support for open source and
>>Linux.
>
>B-b-b-but his *own* model is founded on open hardware specs. How could
>anybody stop M$ from getting their hands on it?
>
I guess he's assuming that M$ can't go buy a PS3 for some reason.

>>Cell Industries forecasts that as Intel begins producing Cell chips,
>>Microsoft will try to port its operating system to the new processor.
>>However, Linux will have a significant head start and Microsoft will
>>in turn "fall apart."
>>
>>"When hardware is commercially available, Windows will take two to
>>three years to get the first version going," Trounson said. "IBM
>>already has Linux running on the Cell [at that point]."
>>
>>Adding that Cell chips will be in short supply for years, Trounson
>>acknowledged that the prediction represents the unprecedented.
>>
>>"The world has never seen a step change in technology like what is
>>about to occur," Trounson said.
>>
>></quote>
>
>One "little" flaw I see - there is talk of:
>
>
>IBM, on the other hand, will "recruit an army of developers" during
>the first year of Cell production by supplying software development systems
>-- as many as 100,000 -- to major application developers and large
>companies, as Trounson told ITMJ.
>
>Who is going to pay for the hardware and software for development? IBM has
>not been good at giving anything away, even to developers and certainly not
>speculatively. That was the main reason for the failure of OS/2. I've
>also mentioned in the past that we, and others, coughed up $$ for Risc/6K
>and Alpha... all for nothing... money down the drain - we won't do that
>again. OTOH I have to confess I do not understand the open source business
>"model".<shrug>
>
Umm, I guess you don't. :).

That's why SCO is taking aim at IBM. Without IBM pumping its own
serious money into Linux, Linux would be nowhere near where it is now,
and IBM _is_ giving stuff away. In return, it has a nice growing
Linux server business (and a pesky lawsuit, to be sure).

I don't see anything wrong with the idea of IBM funding relevant
development, but I think it very unlikely that IBM will go after
anything that would wind up in a PC...unless, of course, IBM had
something _really_ devious in mind in selling off its PC business.

>Off-hand, other things: 1) I don't see the XDR memory sub-system being
>amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
>CPU is kinda slim... without reworking the memory interface to get to 8GB
>per CPU;

Have you looked at the I/O bandwidth?

http://research.scea.com/research/html/CellGDC05/07.html

Four Cell processors = 2GB. Probably no more NUMA than Opteron.

>2) 32-bit FPU is not going to fly as a general purpose computer.
>
SPE's can do IEEE-compliant double precision. Just ten times more
slowly.

>>...and then I woke up.
>
>I hope this guy has a spare grungy garage for his efforts - seems like that
>is part of the template for success he is aiming to emulate... C.F. Dell,
>Apple, et.al.:)

I don't think Trounson is _necessarily_ wrong about how important Cell
might be, but that clunker about PCI-X is hard to get past, never mind
the wild speculation about Intel. Maybe he just had too much coffee
and too little sleep and never figured anyone would be so desperate as
to write a web article off his email.

RM
 

On Fri, 25 Mar 2005 19:52:59 -0500, Robert Myers <rmyers1400@comcast.net>
wrote:

>On Fri, 25 Mar 2005 19:00:55 -0500, George Macdonald
><fammacd=!SPAM^nothanks@tellurian.com> wrote:

>>B-b-b-but his *own* model is founded on open hardware specs. How could
>>anybody stop M$ from getting their hands on it?
>>
>I guess he's assuming that M$ can't go buy a PS3 for some reason.

You mean like they obviously couldn't go and buy Apple systems to practice
on for XBox 2?:)

>>One "little" flaw I see - there is talk of:
>>
>>
>>IBM, on the other hand, will "recruit an army of developers" during
>>the first year of Cell production by supplying software development systems
>>-- as many as 100,000 -- to major application developers and large
>>companies, as Trounson told ITMJ.
>>
>>Who is going to pay for the hardware and software for development? IBM has
>>not been good at giving anything away, even to developers and certainly not
>>speculatively. That was the main reason for the failure of OS/2. I've
>>also mentioned in the past that we, and others, coughed up $$ for Risc/6K
>>and Alpha... all for nothing... money down the drain - we won't do that
>>again. OTOH I have to confess I do not understand the open source business
>>"model".<shrug>
>>
>Umm, I guess you don't. :).

No, I just don't see how programmers are supposed to pay the rent, unless
maybe they've been anointed by one of the self-appointed OS-gurus.

>That's why SCO is taking aim at IBM. Without IBM pumping its own
>serious money into Linux, Linux would be nowhere near where it is now,
>and IBM _is_ giving stuff away. In return, it has a nice growing
>Linux server business (and a pesky lawsuit, to be sure).
>
>I don't see anything wrong with the idea of IBM funding relevant
>development, but I think it very unlikely that IBM will go after
>anything that would wind up in a PC...unless, of course, IBM had
>something _really_ devious in mind in selling off its PC business.

Giving stuff away and giving it to the right people are two different
scenarios. If you've ever been on the good end of an IBM give-away, you'll
know that it is not a comfortable position. As for the PC, it is not going
away any time soon, so there'd better be some vision of how Cell fits into
that slot... Apple's second chance??:)

>>Off-hand, other things: 1) I don't see the XDR memory sub-system being
>>amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
>>CPU is kinda slim... without reworking the memory interface to get to 8GB
>>per CPU;
>
>Have you looked at the I/O bandwidth?
>
>http://research.scea.com/research/html/CellGDC05/07.html
>
>Four cell processors=2GB. Probably no more NUMA than Opteron.

Well it would seem that the inter-CPU communications/coherency is less well
defined for the moment and there's a *big* difference between the current
256MB/CPU of Cell and Opteron's 16GB/CPU.

>>2) 32-bit FPU is not going to fly as a general purpose computer.
>>
>SPE's can do IEEE-compliant double precision. Just ten times more
>slowly.
>
>>>...and then I woke up.
>>
>>I hope this guy has a spare grungy garage for his efforts - seems like that
>>is part of the template for success he is aiming to emulate... C.F. Dell,
>>Apple, et.al.:)
>
>I don't think Trounson is _necessarily_ wrong about how important Cell
>might be, but that clunker about PCI-X is hard to get past, never mind
>the wild speculation about Intel. Maybe he just had too much coffee
>and too little sleep and never figured anyone would be so desperate as
>to write a web article off his email.

It's hard to fathom what *might* be sitting in a lab right now or what NDAs
might be in place, but as it stands, it appears that there's a lot of work
to do to bring it into use in a general purpose computer.

--
Rgds, George Macdonald
 

On Sat, 26 Mar 2005 05:15:27 -0500, George Macdonald
<fammacd=!SPAM^nothanks@tellurian.com> wrote:

>On Fri, 25 Mar 2005 19:52:59 -0500, Robert Myers <rmyers1400@comcast.net>
>wrote:
>
>>On Fri, 25 Mar 2005 19:00:55 -0500, George Macdonald
>><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>

<snip>

>
>>>One "little" flaw I see - there is talk of:
>>>
>>>
>>>IBM, on the other hand, will "recruit an army of developers" during
>>>the first year of Cell production by supplying software development systems
>>>-- as many as 100,000 -- to major application developers and large
>>>companies, as Trounson told ITMJ.
>>>
>>>Who is going to pay for the hardware and software for development? IBM has
>>>not been good at giving anything away, even to developers and certainly not
>>>speculatively. That was the main reason for the failure of OS/2. I've
>>>also mentioned in the past that we, and others, coughed up $$ for Risc/6K
>>>and Alpha... all for nothing... money down the drain - we won't do that
>>>again. OTOH I have to confess I do not understand the open source business
>>>"model".<shrug>
>>>
>>Umm, I guess you don't. :).
>
>No, I just don't see how programmers are supposed to pay the rent, unless
>maybe they've been anointed by one of the self-appointed OS-gurus.
>
This is a big subject, and I won't insult you by taking a weak flyer
at it. A Google search for

economics "open source"

would be a good start.

>>That's why SCO is taking aim at IBM. Without IBM pumping its own
>>serious money into Linux, Linux would be nowhere near where it is now,
>>and IBM _is_ giving stuff away. In return, it has a nice growing
>>Linux server business (and a pesky lawsuit, to be sure).
>>
>>I don't see anything wrong with the idea of IBM funding relevant
>>development, but I think it very unlikely that IBM will go after
>>anything that would wind up in a PC...unless, of course, IBM had
>>something _really_ devious in mind in selling off its PC business.
>
>Giving stuff away and giving it to the right people are two different
>scenarios. If you've ever been on the good end of an IBM give-way, you'll
>know that it is not a comfortable position. As for the PC, it is not going
>away any time soon, so there'd better be some vision of how Cell fits into
>that slot... Apple's second chance??:)
>
The more I look at Cell, the more I am convinced I don't understand
how it will be used. Or rather, I can imagine ways in which it can be
used, but I'm not sure those are the only ways. The more I look at
the architecture, the more I like it, and I see lots of possibilities.

It's easiest to imagine the SPE's processing a bunch of content or
doing number crunching as a stream processor, but I can also imagine
using all those SPE's to overcome the natural limitations of the
in-order PPC: Spin off a task speculatively (or on less than perfect
information), execute in local memory, and commit only when whatever
predicate conditions are satisfied (or throw the result away).

The SPE's can also be isolated (I think) from the world of everyday
interrupts, and I think that might offer some serious advantages for
the processor.

But the question, of course, is not, are there interesting things one
might try, but will any of those things actually be made to work and
what do you get as a payoff. It seems reasonably certain you could
make Cell function as a PC processor if you wanted to. The question
is: why would you want to?

David Wang is worried about the software model. That doesn't worry me
so much. What does worry me is that Sony is in such turmoil and has never
been able to make the "profit is in the content" model really pay off
(and, as far as I can tell, only Apple, in a field of many entrants, has
succeeded at that game). Is a weakened and distracted Sony with a sagging
stock price and turmoil at the top going to turn aside one of the biggest
tidal waves in the history of technology (x86)?

>>>Off-hand, other things: 1) I don't see the XDR memory sub-system being
>>>amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
>>>CPU is kinda slim... without reworking the memory interface to get to 8GB
>>>per CPU;
>>
>>Have you looked at the I/O bandwidth?
>>
>>http://research.scea.com/research/html/CellGDC05/07.html
>>
>>Four cell processors=2GB. Probably no more NUMA than Opteron.
>
>Well it would seem that the inter-CPU communications/coherency is less well
>defined for the moment and there's a *big* difference between the current
>256MB/CPU of Cell and Opteron's 16GB/CPU.
>
Maybe an issue if you want to use it for in-memory databases or a
server, but not so much so for computationally-intensive work.

>>>2) 32-bit FPU is not going to fly as a general purpose computer.
>>>
>>SPE's can do IEEE-compliant double precision. Just ten times more
>>slowly.
>>
>>>>...and then I woke up.
>>>
>>>I hope this guy has a spare grungy garage for his efforts - seems like that
>>>is part of the template for success he is aiming to emulate... C.F. Dell,
>>>Apple, et.al.:)
>>
And I think he's got the wrong product. If IBM isn't working on a
Blue Gene style card already, I'll be amazed.

>>I don't think Trounson is _necessarily_ wrong about how important Cell
>>might be, but that clunker about PCI-X is hard to get past, never mind
>>the wild speculation about Intel. Maybe he just had too much coffee
>>and too little sleep and never figured anyone would be so desperate as
>>to write a web article off his email.
>
>It's hard to fathom what *might* be sitting in a lab right now or what NDAs
>might be in place, but as it stands, it appears that there's a lot of work
>to do to bring it into use in a general purpose computer.

The "front-end" is a PowerPC. Multi-threaded and in-order, but a
PowerPC, nevertheless. The compiler exists. I'll bet there is even
significant experience getting it to work with DSP coprocessors.

There is always the cautionary tale of Itanium (which could wind up
looking more than a little bit like Cell). Intel was much better
positioned than Sony and had much greater resources, and how far has
it gotten?

RM
 

"Robert Myers" <rmyers1400@comcast.net> wrote in message
news:5gha41dcub1ebt8roo6c8l7jmjlv4v0ned@4ax.com...
>
> Spin off a task speculatively (or on less than perfect
> information), execute in local memory, and commit only when whatever
> predicate conditions are satisfied (or throw the result away).

Wow! Hand-tuned assembly language whose carefully crafted results get
thrown out. That looks like a very efficient way to develop modern
software! ;-)
 

On Sat, 26 Mar 2005 13:58:14 GMT, "Felger Carbon" <fmsfnf@jfoops.net>
wrote:

>"Robert Myers" <rmyers1400@comcast.net> wrote in message
>news:5gha41dcub1ebt8roo6c8l7jmjlv4v0ned@4ax.com...
>>
>> Spin off a task speculatively (or on less than perfect
>> information), execute in local memory, and commit only when whatever
>> predicate conditions are satisfied (or throw the result away).
>
>Wow! Hand-tuned assembly language whose carefully crafted results get
>thrown out. That looks like a very efficient way to develop modern
>software! ;-)
>
I wasn't expecting it to be produced as hand-tuned assembly. You
forget my involvement with Itanium. Everything will be possible with
a compiler...one day.

Itanium compilers are already a fair bit of the way down this road.
You identify a task you can't be sure is safe because of data
ambiguity. You set a predicate condition, execute the task, and check
the predicate.

With multiple execution units sitting on a bus connected to the CPU,
you don't have to wring your hands so much over the costs of spinning
off an execution path without full information. It should be no
harder than Itanium predicated execution and maybe much easier.

RM
 

On Sat, 26 Mar 2005 07:29:33 -0500, Robert Myers <rmyers1400@comcast.net>
wrote:

>On Sat, 26 Mar 2005 05:15:27 -0500, George Macdonald
><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>
>>On Fri, 25 Mar 2005 19:52:59 -0500, Robert Myers <rmyers1400@comcast.net>
>>wrote:
>>
>>>On Fri, 25 Mar 2005 19:00:55 -0500, George Macdonald
>>><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>>
>
><snip>
>
>>
>>>>One "little" flaw I see - there is talk of:
>>>>
>>>>
>>>>IBM, on the other hand, will "recruit an army of developers" during
>>>>the first year of Cell production by supplying software development systems
>>>>-- as many as 100,000 -- to major application developers and large
>>>>companies, as Trounson told ITMJ.
>>>>
>>>>Who is going to pay for the hardware and software for development? IBM has
>>>>not been good at giving anything away, even to developers and certainly not
>>>>speculatively. That was the main reason for the failure of OS/2. I've
>>>>also mentioned in the past that we, and others, coughed up $$ for Risc/6K
>>>>and Alpha... all for nothing... money down the drain - we won't do that
>>>>again. OTOH I have to confess I do not understand the open source business
>>>>"model".<shrug>
>>>>
>>>Umm, I guess you don't. :).
>>
>>No, I just don't see how programmers are supposed to pay the rent, unless
>>maybe they've been anointed by one of the self-appointed OS-gurus.
>>
>This is a big subject, and I won't insult you by taking a weak flyer
>at it. The google
>
>economics "open source"
>
>would be a good start.

Oh I've already read a bit on it and it just doesn't make sense to me. One
case in point: an often-mentioned OS factoid has "geeks" playing in their
"spare time" to create software; if, as usually presented, they are also
professional programmers "during the day", they are very likely breaking
their employment agreements. Add in the fact that many (most) professional
programmers work *some* overtime for their employers, often at odd hours
and under pressure, and the whole concept of OS is a fantasy to me.

>>>That's why SCO is taking aim at IBM. Without IBM pumping its own
>>>serious money into Linux, Linux would be nowhere near where it is now,
>>>and IBM _is_ giving stuff away. In return, it has a nice growing
>>>Linux server business (and a pesky lawsuit, to be sure).
>>>
>>>I don't see anything wrong with the idea of IBM funding relevant
>>>development, but I think it very unlikely that IBM will go after
>>>anything that would wind up in a PC...unless, of course, IBM had
>>>something _really_ devious in mind in selling off its PC business.
>>
>>Giving stuff away and giving it to the right people are two different
>>scenarios. If you've ever been on the good end of an IBM give-way, you'll
>>know that it is not a comfortable position. As for the PC, it is not going
>>away any time soon, so there'd better be some vision of how Cell fits into
>>that slot... Apple's second chance??:)
>>
>The more I look at Cell, the more I am convinced I don't understand
>how it will be used. Or rather, I can imagine ways in which it can be
>used, but I'm not sure those are the only ways. The more I look at
>the architecture, the more I like it, and I see lots of possibilities.
>
>It's easiest to imagine the SPE's processing a bunch of content or
>doing number crunching as a stream processor, but I can also imagine
>using all those SPE's to overcome the natural limitations of the
>in-order PPC: Spin off a task speculatively (or on less than perfect
>information), execute in local memory, and commit only when whatever
>predicate conditions are satisfied (or throw the result away).
>
>The SPE's can also be isolated (I think) from the world of everyday
>interrupts, and I think that might offer some serious advantages for
>the processor.
>
>But the question, of course, is not, are there interesting things one
>might try, but will any of those things actually be made to work and
>what do you get as a payoff. It seems reasonably certain you could
>make Cell function as a PC processor if you wanted to. The question
>is: why would you want to?

So my question is: what else (useful) will you do with it?... make ASPs out
of it? I don't think so - even IT can't make its politics work there. If
you can build game boxes and super computers with it, why not PCs? As
Apple's next (or next/next) CPU it may not be that far fetched - obviously
they already have the PPC part down.

>David Wang is worried about the software model. That doesn't worry me
>so much. The fact that Sony is in such turmoil and has never been
>able to make the "profit is in the content" model really pay off for
>them (and, as far as I can tell, only Apple, in a field of many
>entrants, has succeeded at that game). A weakened and distracted Sony
>with a sagging stock price and turmoil at the top is going to turn
>aside one of the biggest tidal waves in the history of technology
>(x86)?

I agree with David - the software environment is necessarily horribly
complex and AFAICT at this stage, needs programmers of a calibre which is
not commonly found... near genius even. Mr. Trounson's runtime compiler is
a *very* old idea, which has had no takers till now.

>>>>Off-hand, other things: 1) I don't see the XDR memory sub-system being
>>>>amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
>>>>CPU is kinda slim... without reworking the memory interface to get to 8GB
>>>>per CPU;
>>>
>>>Have you looked at the I/O bandwidth?
>>>
>>>http://research.scea.com/research/html/CellGDC05/07.html
>>>
>>>Four cell processors=2GB. Probably no more NUMA than Opteron.
>>
>>Well it would seem that the inter-CPU communications/coherency is less well
>>defined for the moment and there's a *big* difference between the current
>>256MB/CPU of Cell and Opteron's 16GB/CPU.
>>
>Maybe an issue if you want to use it for in-memory databases or a
>server, but not so much so for computationally-intensive work.

They're not even in the same ballpark. We already hear talk of (PC) game
developers raving about the >4GB address space of x86-64 and what they're
going to do with it; I guess Sony is not anticipating doing similar things
for PS3 players??

>>>I don't think Trounson is _necessarily_ wrong about how important Cell
>>>might be, but that clunker about PCI-X is hard to get past, never mind
>>>the wild speculation about Intel. Maybe he just had too much coffee
>>>and too little sleep and never figured anyone would be so desperate as
>>>to write a web article off his email.
>>
>>It's hard to fathom what *might* be sitting in a lab right now or what NDAs
>>might be in place, but as it stands, it appears that there's a lot of work
>>to do to bring it into use in a general purpose computer.
>
>The "front-end" is a PowerPC. Multi-threaded and in-order, but a
>PowerPC, nevertheless. The compiler exists. I'll bet there is even
>significant experience getting it to work with DSP coprocessors.

It still looks like a steep slope to me... starting with the memory
interface. Dave has outlined how to do it, to get to 4GB with 512Mb chips,
but until it's actually done we don't really know.

>There is always the cautionary tale of itanium (which could wind up
>looking more than a little bit like Cell). Intel was much better
>positioned than Sony, it had much greater resources, and how far has
>it gotten?

So you're not tempted to have a little flutter on RMBS? The pump 'n'
dumpers seem to have gone cold on it with the Infineon deal - are they not
paying attention?:)

--
Rgds, George Macdonald
 

On Sat, 26 Mar 2005 20:04:26 -0500, George Macdonald
<fammacd=!SPAM^nothanks@tellurian.com> wrote:

>On Sat, 26 Mar 2005 07:29:33 -0500, Robert Myers <rmyers1400@comcast.net>
>wrote:
>
>>On Sat, 26 Mar 2005 05:15:27 -0500, George Macdonald
>><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>>
>>>On Fri, 25 Mar 2005 19:52:59 -0500, Robert Myers <rmyers1400@comcast.net>
>>>wrote:
>>>
>>>>On Fri, 25 Mar 2005 19:00:55 -0500, George Macdonald
>>>><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>>>
>>
>><snip>
>>
>>>
>>>>>One "little" flaw I see - there is talk of:
>>>>>
>>>>>
>>>>>IBM, on the other hand, will "recruit an army of developers" during
>>>>>the first year of Cell production by supplying software development systems
>>>>>-- as many as 100,000 -- to major application developers and large
>>>>>companies, as Trounson told ITMJ.
>>>>>
>>>>>Who is going to pay for the hardware and software for development? IBM has
>>>>>not been good at giving anything away, even to developers and certainly not
>>>>>speculatively. That was the main reason for the failure of OS/2. I've
>>>>>also mentioned in the past that we, and others, coughed up $$ for Risc/6K
>>>>>and Alpha... all for nothing... money down the drain - we won't do that
>>>>>again. OTOH I have to confess I do not understand the open source business
>>>>>"model".<shrug>
>>>>>
>>>>Umm, I guess you don't. :).
>>>
>>>No, I just don't see how programmers are supposed to pay the rent, unless
>>>maybe they've been anointed by one of the self-appointed OS-gurus.
>>>
>>This is a big subject, and I won't insult you by taking a weak flyer
>>at it. The google
>>
>>economics "open source"
>>
>>would be a good start.
>
>Oh I've already read a bit on it and it just doesn't make sense to me. One
>case in point: an often mentioned OS factoid has "geeks" playing in their
>"spare time" to create software; if, as usually presented, they are also
>professional programmers "during the day", they are very likely breaking
>their employment agreement. Add in the fact that many (most) professional
>programmers work *some* overtime for their employers and often at odd
>hours, under pressure, the whole concept of OS is a fantasy to me.
>
You've obviously been reading the output of the Alexis de Tocqueville
Institute. I wonder how much code is really written that way. Open
Source has been awfully professionalized.

There are so many different business models: The money is in _______.

(a) Hardware.
(b) Software.
(c) Services.
(d) Content.

Give away whatever isn't the source of revenue to tap into whatever
is. Or, as in the case of open source software, use controversial dual
licensing to give away software to establish it as a standard so you
can sell it.

<snip>

>>>Giving stuff away and giving it to the right people are two different
>>>scenarios. If you've ever been on the good end of an IBM give-way, you'll
>>>know that it is not a comfortable position. As for the PC, it is not going
>>>away any time soon, so there'd better be some vision of how Cell fits into
>>>that slot... Apple's second chance??:)
>>>

<snip>

>>
>>But the question, of course, is not, are there interesting things one
>>might try, but will any of those things actually be made to work and
>>what do you get as a payoff. It seems reasonably certain you could
>>make Cell function as a PC processor if you wanted to. The question
>>is: why would you want to?
>
>So my question is: what else (useful) will you do with it?... make ASPs out
>of it? I don't think so - even IT can't make its politics work there. If
>you can build game boxes and super computers with it, why not PCs? As
>Apple's next (or next/next) CPU it may not be that far fetched - obviously
>they already have the PPC part down.
>
Well, but _why_? That's what we have yet to see. Only if it turns
out that you can give the user a completely different experience, or
if Apple and IBM can't come to terms on continuing the current
relationship.

The other model is that a digital home entertainment center displaces
the PC. As far as the PC functions are concerned, it's probably more
of a thin client than a PC. Apple and Sony could do that in
partnership. I doubt either can do it alone.

>>David Wang is worried about the software model. That doesn't worry me
>>so much. The fact that Sony is in such turmoil and has never been
>>able to make the "profit is in the content" model really pay off for
>>them (and, as far as I can tell, only Apple, in a field of many
>>entrants, has succeeded at that game). A weakened and distracted Sony
>>with a sagging stock price and turmoil at the top is going to turn
>>aside one of the biggest tidal waves in the history of technology
>>(x86)?
>
>I agree with David - the software environment is necessarily horribly
>complex and AFAICT at this stage, needs programmers of a calibre which is
>not commonly found... near genius even. Mr. Trounson's runtime compiler is
>a *very* old idea, which has had no takers till now.
>
"The software is going to be the problem" would have been a pretty
safe bet through much of the history of computing.

Sony claims the SPEs can be programmed in C, but the Open Source
model implicitly assumes that gcc (or equivalent) is the universal
translator, and it's hard to imagine gcc ever being up to the task of
taking advantage of the SPEs without explicit programmer intervention.

I'm not sure that the real problem with Cell isn't that it is coming
along at the wrong time. Too much is already in place, and too much
would have to be reinvented to get out of Cell even a fraction of the
potential that might be there. Suppose Cell were the central hardware
for a Project MAC? Given a blank piece of paper, people can be
awfully inventive.

>>>>>Off-hand, other things: 1) I don't see the XDR memory sub-system being
>>>>>amenable to memory "strips" and even with 1Gbit chips, 512MB of memory per
>>>>>CPU is kinda slim... without reworking the memory interface to get to 8GB
>>>>>per CPU;
>>>>
>>>>Have you looked at the I/O bandwidth?
>>>>
>>>>http://research.scea.com/research/html/CellGDC05/07.html
>>>>
>>>>Four cell processors=2GB. Probably no more NUMA than Opteron.
>>>
>>>Well it would seem that the inter-CPU communications/coherency is less well
>>>defined for the moment and there's a *big* difference between the current
>>>256MB/CPU of Cell and Opteron's 16GB/CPU.
>>>
>>Maybe an issue if you want to use it for in-memory databases or a
>>server, but not so much so for computationally-intensive work.
>
>They're not even in the same ballpark. We already hear talk of (PC) game
>developers raving about the >4GB address space of x86-64 and what they're
>going to do with it; I guess Sony is not anticipating doing similar things
>for PS3 players??
>
I can easily believe that games will eventually entail very large
amounts of state. If the memory interface has to be reworked, it has
to be reworked.

<snip>

>>There is always the cautionary tale of itanium (which could wind up
>>looking more than a little bit like Cell). Intel was much better
>>positioned than Sony, it had much greater resources, and how far has
>>it gotten?
>
>So you're not tempted to have a little flutter on RMBS? The pump 'n'
>dumpers seem to have gone cold on it with the Infineon deal - are they not
>paying attention?:)

I suspect the markets have already discounted RMBS benefitting from
Playstation 3, which, after all, is just a followon to Playstation 2.

RM
 

On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <rmyers1400@comcast.net>
wrote:

>On Sat, 26 Mar 2005 20:04:26 -0500, George Macdonald
><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>
>>
>>Oh I've already read a bit on it and it just doesn't make sense to me. One
>>case in point: an often mentioned OS factoid has "geeks" playing in their
>>"spare time" to create software; if, as usually presented, they are also
>>professional programmers "during the day", they are very likely breaking
>>their employment agreement. Add in the fact that many (most) professional
>>programmers work *some* overtime for their employers and often at odd
>>hours, under pressure, the whole concept of OS is a fantasy to me.
>>
>You've obviously been reading the output of the Alexis de Tocqueville
>Institute. I wonder how much code is really written that way. Open
>Source has been awfully professionalized.

No, it's a fairly regularly mentioned scenario to describe how OS works -
do the search yourself. How can you say something is "professionalized"
when the program design and coding has to be given away?

>There are so many different business models: The money is in _______.
>
>(a) Hardware.
>(b) Software.
>(c) Services.
>(d) Content.
>
>Give away whatever isn't the source of revenue to tap into whatever
>is. Or, as in the case of open source software, use controversial dual
>licensing to give away software to establish it as a standard so you
>can sell it.

So the (b) above is not a source of revenue any longer then! So let's say
I come up with a novel, revolutionary algorithm, e.g. practical solver for
the traveling salesman problem with true optimal solutions; I then design
the method for implementation and code it all up. Now I'm supposed to give
it away because it uses libraries which are OS?

No, I can see where OS *might* be useful when the algorithms & methods used
for a particular sub-system are commonly known and all that's needed is
"yet another" version of the same old widget. Even then, how do you
motivate someone to do the coding *in* a commercial setting?... IOW not
some student or graduate who wants to impress?

><snip>

>>>
>>>But the question, of course, is not, are there interesting things one
>>>might try, but will any of those things actually be made to work and
>>>what do you get as a payoff. It seems reasonably certain you could
>>>make Cell function as a PC processor if you wanted to. The question
>>>is: why would you want to?
>>
>>So my question is: what else (useful) will you do with it?... make ASPs out
>>of it? I don't think so - even IT can't make its politics work there. If
>>you can build game boxes and super computers with it, why not PCs? As
>>Apple's next (or next/next) CPU it may not be that far fetched - obviously
>>they already have the PPC part down.
>>
>Well, but _why_? That's what we have yet to see. Only if it turns
>out that you can give the user a completely different experience, or
>if Apple and IBM can't come to terms on continuing the current
>relationship.

Why?... the usual quest for better & faster widgets to sell.

>The other model is that a digital home entertainment center displaces
>the PC. As far as the PC functions are concerned, it's probably more
>of a thin client than a PC. Apple and Sony could do that in
>partnership. I doubt either can do it alone.

I don't think either needs the other and I don't see why such a powerful
engine is limited to a thin client. What is it going to connect to for the
"work"?... not the Internet.

--
Rgds, George Macdonald
 

On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald
<fammacd=!SPAM^nothanks@tellurian.com> wrote:

>On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <rmyers1400@comcast.net>
>wrote:
>
>>On Sat, 26 Mar 2005 20:04:26 -0500, George Macdonald
>><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>>
>>>
>>>Oh I've already read a bit on it and it just doesn't make sense to me. One
>>>case in point: an often mentioned OS factoid has "geeks" playing in their
>>>"spare time" to create software; if, as usually presented, they are also
>>>professional programmers "during the day", they are very likely breaking
>>>their employment agreement. Add in the fact that many (most) professional
>>>programmers work *some* overtime for their employers and often at odd
>>>hours, under pressure, the whole concept of OS is a fantasy to me.
>>>
>>You've obviously been reading the output of the Alexis de Tocqueville
>>Institute. I wonder how much code is really written that way. Open
>>Source has been awfully professionalized.
>
>No, it's a fairly regularly mentioned scenario to describe how OS works -
>do the search yourself. How can you say something is "professionalized"
>when the program design and coding has to be given away?
>
>>There are so many different business models: The money is in _______.
>>
>>(a) Hardware.
>>(b) Software.
>>(c) Services.
>>(d) Content.
>>
>>Give away whatever isn't the source of revenue to tap into whatever
>>is. Or, as in the case of open source software, use controversial dual
>>licensing to give away software to establish it as a standard so you
>>can sell it.
>
>So the (b) above is not a source of revenue any longer then!
>
RedHat certainly thinks (b) can be a source of revenue, but Wall
Street seems increasingly skeptical:

http://www.forbes.com/markets/2005/03/24/cx_el_0324weekmarkets.html

<quote>

Red Hat (nasdaq: RHAT) will report fiscal
fourth-quarter earnings on Thursday. The Street is expecting earnings
of 6 cents per share on revenue of $56 million. Earlier this month
Standard & Poor's Equity Research downgraded to "sell" from "hold" and
cut the target price, citing expectations for further pricing pressure
for Linux software and services, which "could negatively impact
shares" in the near term.

</quote>

>>So let's say
>>I come up with a novel, revolutionary algorithm, e.g. practical solver for
>>the traveling salesman problem with true optimal solutions; I then design
>>the method for implementation and code it all up. Now I'm supposed to give
>>it away because it uses libraries which are OS?

Highly-specialized software is staying closed source mostly, isn't it?

>No, I can see where OS *might* be useful when the algorithms & methods used
>for a particular sub-system are commonly known and all that's needed is
>"yet another" version of the same old widget. Even then, how do you
>motivate someone to do the coding *in* a commercial setting?... IOW not
>some student or graduate who wants to impress?
>
Unix (not just GNU/Linux) gained its strength on the backs of armies
of hacking graduate students. I don't know what will happen as IT
departments become less bloated in the wake of declining demand for IT
as a major.

>><snip>
>
>>>>
>>>>But the question, of course, is not, are there interesting things one
>>>>might try, but will any of those things actually be made to work and
>>>>what do you get as a payoff. It seems reasonably certain you could
>>>>make Cell function as a PC processor if you wanted to. The question
>>>>is: why would you want to?
>>>
>>>So my question is: what else (useful) will you do with it?... make ASPs out
>>>of it? I don't think so - even IT can't make its politics work there. If
>>>you can build game boxes and super computers with it, why not PCs? As
>>>Apple's next (or next/next) CPU it may not be that far fetched - obviously
>>>they already have the PPC part down.
>>>
>>Well, but _why_? That's what we have yet to see. Only if it turns
>>out that you can give the user a completely different experience, or
>>if Apple and IBM can't come to terms on continuing the current
>>relationship.
>
>Why?... the usual quest for better & faster widgets to sell.
>

At the price of having to rewrite everything?

Cell looks to me like the realization of many hardware fantasies, and
a pretty slick one at that. Now what do we do with it?

I mean, I can think of *lots* of things to do with Cell. I just don't
know how many of them are going to get done in a way that will have
any kind of market impact. Cell looks like a natural dataflow
processor to me, but how many dataflow programmers are there out
there?

In the mid-nineties, people (not just the email that started this
thread) would be saying that Cell would slay the twin dragons of
Wintel. People would be fantasizing about who was going to make how
much money doing it. Gates/Ballmer would be whipping the staff into a
hysterical frenzy, and Microsoft would be announcing unbelievable
vaporware. I guess the champagne bottle has just been sitting open
for too long.

>>The other model is that a digital home entertainment center displaces
>>the PC. As far as the PC functions are concerned, it's probably more
>>of a thin client than a PC. Apple and Sony could do that in
>>partnership. I doubt either can do it alone.
>
>I don't think either needs the other and I don't see why such a powerful
>engine is limited to a thin client. What is it going to connect to for the
>"work"?... not the Internet.

Everybody seems to be talking about what a powerful processor of media
streams Cell will be. That, other than games for Playstation 3, seems
to be the only guaranteed application. How much on-the-fly processing
can media streams absorb, anyway? On-the-fly realization for
multi-player games? How would I know?

As to why I'm back on the thin-client bandwagon (never got off it,
really), it's one easy way to finesse the "That's a really nice chip,
now where's the software?" problem. Do whatever you find convenient
locally, do whatever you find inconvenient remotely.

Don't know how your remote desktops work or if you even use them, but
I can definitely tell that a remote desktop is remote, even at
100Mbps, using any of the standard tools available to me. I'm
assuming that with better on-the-fly processing, one could do much
better, and one will have to do much better to make a thin client over
the internet acceptable.

RM
 

keith


On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald wrote:

> On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <rmyers1400@comcast.net>
> wrote:

>>You've obviously been reading the output of the Alexis de Tocqueville
>>Institute. I wonder how much code is really written that way. Open
>>Source has been awfully professionalized.
>
> No, it's a fairly regularly mentioned scenario to describe how OS works -
> do the search yourself. How can you say something is "professionalized"
> when the program design and coding has to be given away?
>
>>There are so many different business models: The money is in _______.
>>
>>(a) Hardware.
>>(b) Software.
>>(c) Services.
>>(d) Content.
>>
>>Give away whatever isn't the source of revenue to tap into whatever
>>is. Or, as in the case of open source software, use controversial dual
>>licensing to give away software to establish it as a standard so you
>>can sell it.
>
> So the (b) above is not a source of revenue any longer then!

For some models, no. For others, certainly. It's a matter of what *you*
decide are your razors and blades.

> So let's say
> I come up with a novel, revolutionary algorithm, e.g. practical solver
> for the traveling salesman problem with true optimal solutions; I then
> design the method for implementation and code it all up. Now I'm
> supposed to give it away because it uses libraries which are OS?

There is no requirement to do this. You can keep *your* code private. If
that's what you're selling, it even makes sense. ;-)

> No, I can see where OS *might* be useful when the algorithms & methods
> used for a particular sub-system are commonly known and all that's
> needed is "yet another" version of the same old widget. Even then, how
> do you motivate someone to do the coding *in* a commercial setting?...
> IOW not some student or graduate who wants to impress?

OS <> Applications <> algorithms.

<snip>

--
Keith
 

On Mon, 28 Mar 2005 15:59:35 -0500, Robert Myers <rmyers1400@comcast.net>
wrote:

>On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald
><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>
>>On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <rmyers1400@comcast.net>
>>wrote:
>>

>>>You've obviously been reading the output of the Alexis de Tocqueville
>>>Institute. I wonder how much code is really written that way. Open
>>>Source has been awfully professionalized.
>>
>>No, it's a fairly regularly mentioned scenario to describe how OS works -
>>do the search yourself. How can you say something is "professionalized"
>>when the program design and coding has to be given away?
>>
>>>There are so many different business models: The money is in _______.
>>>
>>>(a) Hardware.
>>>(b) Software.
>>>(c) Services.
>>>(d) Content.
>>>
>>>Give away whatever isn't the source of revenue to tap into whatever
>>>is. Or, as in the case of open source sotware, use controversial dual
>>>licensing to give away software to establish it as a standard so you
>>>can sell it.
>>
>>So the (b) above is not a source of revenue any longer then!
>>
>RedHat certainly thinks (b) can be a source of revenue, but Wall
>Street seems increasingly skeptical:

I thought we were talking about earning $$ from developing software -
paying analysts and programmers to design and write code. When you pay
RedHat it's to cover all the ancillaries - advertising, packaging,
admin., etc. - plus, AIUI, some form of support.

>http://www.forbes.com/markets/2005/03/24/cx_el_0324weekmarkets.html
>
><quote>
>
>Red Hat (nasdaq: RHAT) will report fiscal
>fourth-quarter earnings on Thursday. The Street is expecting earnings
>of 6 cents per share on revenue of $56 million. Earlier this month
>Standard & Poor's Equity Research downgraded to "sell" from "hold" and
>cut the target price, citing expectations for further pricing pressure
>for Linux software and services, which "could negatively impact
>shares" in the near term.
>
></quote>
>
>>>So let's say
>>>I come up with a novel, revolutionary algorithm, e.g. practical solver for
>>>the traveling salesman problem with true optimal solutions; I then design
>>>the method for implementation and code it all up. Now I'm supposed to give
>>>it away because it uses libraries which are OS?
>
>Highly-specialized software is staying closed source mostly, isn't it?

There's a huge (dynamic) fuzzy area there - today's technology is
tomorrow's commodity of course but maybe you're right: software is about to
enter a new era where it leaves behind the whoring-model... "ya got it, ya
sell it... ya still got it".;-)

>>No, I can see where OS *might* be useful when the algorithms & methods used
>>for a particular sub-system are commonly known and all that's needed is
>>"yet another" version of the same old widget. Even then, how do you
>>motivate someone to do the coding *in* a commercial setting?... IOW not
>>some student or graduate who wants to impress?
>>
>Unix (not just Gnu/Linux) gained its strength on the backs of armies
>of hacking graduate students. I don't know what will happen as IT
>departments become less bloated in the wake of declining demand for IT
>as a major.

Ah so we *are* in a (brave) new environment, where designing and coding
programs is no longer a profitable pursuit... unless you have a novel
algorithmic twist?

>>><snip>

>>>Well, but _why_? That's what we have yet to see. Only if it turns
>>>out that you can give the user a completely different experience, or
>>>if Apple and IBM can't come to terms on continuing the current
>>>relationship.
>>
>>Why?... the usual quest for better & faster widgets to sell.
>>
>
>At the price of having to rewrite everything?

Ya mean like Itanium?:) I'd gotten the impression that the mundane stuff
would just run on the PPC core and then... for newer creative stuff you
could get more adventurous with the SPEs - no? IOW whatever fits in the
porta-"C" category, and much of that is not performance-critical, just do
it - the real bonus is in the rest.

>Cell looks to me like the realization of many hardware fantasies, and
>a pretty slick one at that. Now what do we do with it?
>
>I mean, I can think of *lots* of things to do with Cell. I just don't
>know how many of them are going to get done in a way that will have
>any kind of market impact. Cell looks like a natural dataflow
>processor to me, but how many dataflow programmers are there out
>there?
>
>In the mid-nineties, people (not just the email that started this
>thread) would be saying that Cell would slay the twin dragons of
>Wintel. People would be fantasizing about who was going to make how
>much money doing it. Gates/Ballmer would be whipping the staff into a
>hysterical frenzy, and Microsoft would be announcing unbelievable
>vaporware. I guess the champagne bottle has just been sitting open
>for too long.

After Alpha, and err, Itanium, plus MIPS & Risc-6K/Power in the Windows
arena, it gets harder to get excited.... sobriety?:)

>>>The other model is that a digital home entertainment center displaces
>>>the PC. As far as the PC functions are concerned, it's probably more
>>>of a thin client than a PC. Apple and Sony could do that in
>>>partnership. I doubt either can do it alone.
>>
>>I don't think either needs the other and I don't see why such a powerful
>>engine is limited to a thin client. What is it going to connect to for the
>>"work"?... not the Internet.
>
>Everybody seems to be talking about what a powerful processor of media
>streams Cell will be. That, other than games for Playstation 3, seems
>to be the only guaranteed application. How much on-the-fly processing
>can media streams absorb, anyway? On-the-fly realization for
>multi-player games? How would I know?
>
>As to why I'm back on the thin-client bandwagon (never got off it,
>really), it's one easy way to finesse the "That's a really nice chip,
>now where's the software?" problem. Do whatever you find convenient
>locally, do whatever you find inconvenient remotely.
>
>Don't know how your remote desktops work or if you even use them, but
>I can definitely tell that a remote desktop is remote, even at
>100mbps, using any of the standard tools available to me. I'm
>assuming that with better on-the-fly processing, one could do much
>better, and one will have to do much better to make a thin client over
>the internet acceptable.

When Larry E. first proposed his thin client "idea" [I know there were
others but L.E. had the *big* $$ and *big* motivation] I came up with the
term SQL*Nuts... kinda like the way his err, personal assistant was known
as SQL*Slut. I haven't changed my mind. Now we've had IT-folk dreaming of
the return of the glass houses but it still doesn't seem to be going
anywhere fast. As I've said before: people hate public transportation.

--
Rgds, George Macdonald
 

On Mon, 28 Mar 2005 22:28:26 -0500, keith <krw@att.bizzzz> wrote:

>On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald wrote:
>
>> On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <rmyers1400@comcast.net>
>> wrote:
>
>>>You've obviously been reading the output of the Alexis de Tocqueville
>>>Institute. I wonder how much code is really written that way. Open
>>>Source has been awfully professionalized.
>>
>> No, it's a fairly regularly mentioned scenario to describe how OS works -
>> do the search yourself. How can you say something is "professionalized"
>> when the program design and coding has to be given away?
>>
>>>There are so many different business models: The money is in _______.
>>>
>>>(a) Hardware.
>>>(b) Software.
>>>(c) Services.
>>>(d) Content.
>>>
>>>Give away whatever isn't the source of revenue to tap into whatever
>>>is. Or, as in the case of open source software, use controversial dual
>>>licensing to give away software to establish it as a standard so you
>>>can sell it.
>>
>> So the (b) above is not a source of revenue any longer then!
>
>For some models, no. For others, certainly. It's a matter of what *you*
>decide are your razors and blades.

Whatever is err, patentable?;-)

>> So let's say
>> I come up with a novel, revolutionary algorithm, e.g. practical solver
>> for the traveling salesman problem with true optimal solutions; I then
>> design the method for implementation and code it all up. Now I'm
>> supposed to give it away because it uses libraries which are OS?
>
>There is no requirement to do this. You can keep *your* code private. If
>that's what you're selling, it even makes sense. ;-)

I'd rather pay for the OS, compiler and libraries and compete, unfettered
by GPL-like impositions, on an even field.

>> No, I can see where OS *might* be useful when the algorithms & methods
>> used for a particular sub-system are commonly known and all that's
>> needed is "yet another" version of the same old widget. Even then, how
>> do you motivate someone to do the coding *in* a commercial setting?...
>> IOW not some student or graduate who wants to impress?
>
>OS <> Applications <> algorithms.

Of course, but there are obvious inter-dependencies. It also depends what
is meant by an OS, which is generally assumed to include a certain
repertoire of utility "apps". There are algorithms within algorithms and
nothing "works" without them. BTW I am vehemently opposed to patenting of
algorithms - we've seen enough damage there.

--
Rgds, George Macdonald
 

On Tue, 29 Mar 2005 15:58:54 -0500, George Macdonald
<fammacd=!SPAM^nothanks@tellurian.com> wrote:

>On Mon, 28 Mar 2005 15:59:35 -0500, Robert Myers <rmyers1400@comcast.net>
>wrote:
>
>>On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald
>><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>>
>>>On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <rmyers1400@comcast.net>
>>>wrote:
>>>
>
>>>>You've obviously been reading the output of the Alexis de Tocqueville
>>>>Institute. I wonder how much code is really written that way. Open
>>>>Source has been awfully professionalized.
>>>
>>>No, it's a fairly regularly mentioned scenario to describe how OS works -
>>>do the search yourself. How can you say something is "professionalized"
>>>when the program design and coding has to be given away?
>>>
>>>>There are so many different business models: The money is in _______.
>>>>
>>>>(a) Hardware.
>>>>(b) Software.
>>>>(c) Services.
>>>>(d) Content.
>>>>
>>>>Give away whatever isn't the source of revenue to tap into whatever
>>>>is. Or, as in the case of open source software, use controversial dual
>>>>licensing to give away software to establish it as a standard so you
>>>>can sell it.
>>>
>>>So the (b) above is not a source of revenue any longer then!
>>>
>>RedHat certainly thinks (b) can be a source of revenue, but Wall
>>Street seems increasingly skeptical:
>
>I thought we were talking about earning $$ from developing software -
>paying analysts and programmers to design and write code. When you pay
>RedHat it's to cover all the ancillaries, like advertising, packaging
>admin. etc. plus AIUI, some form of support.
>
>>http://www.forbes.com/markets/2005/03/24/cx_el_0324weekmarkets.html
>>
>><quote>
>>
>>Red Hat (nasdaq: RHAT) will report fiscal
>>fourth-quarter earnings on Thursday. The Street is expecting earnings
>>of 6 cents per share on revenue of $56 million. Earlier this month
>>Standard & Poor's Equity Research downgraded to "sell" from "hold" and
>>cut the target price, citing expectations for further pricing pressure
>>for Linux software and services, which "could negatively impact
>>shares" in the near term.
>>
>></quote>
>>
>>>>So let's say
>>>>I come up with a novel, revolutionary algorithm, e.g. practical solver for
>>>>the traveling salesman problem with true optimal solutions; I then design
>>>>the method for implementation and code it all up. Now I'm supposed to give
>>>>it away because it uses libraries which are OS?
>>
>>Highly-specialized software is staying closed source mostly, isn't it?
>
>There's a huge (dynamic) fuzzy area there - today's technology is
>tomorrow's commodity of course but maybe you're right: software is about to
>enter a new era where it leaves behind the whoring-model... "ya got it, ya
>sell it... ya still got it".;-)
>
One of the very few things Edward Teller said that I agreed with was
that the things that really make a difference in national security
don't need to be classified because you can't write down, transmit, or
easily steal the secrets, anyway. The prizes of World War II were the
actual rocket scientists, not their blueprints or even prototypes.

Players more or less _have_ to contribute to these communal efforts,
and their assets are the people who really understand what's going on.
Take your eye off the ball for a short period, and you're quickly out
of the game.

You don't want RedHat's actual packaged software? No problem. But if
it breaks, you're on your own or at the mercy of community resources.
That's neither free software nor commercial software, but RedHat _is_
making money off software.

From an end user's point of view, I don't know that the biggest
concern works much differently either way. Unless your favorite
software is kept up to date so that it can live happily with the
latest kernel, you could be out of luck. Have it happen to you just
once, spend some time digging through mail lists trying to figure out
how the kernel headers changed, and you realize what a problem it is.
Wouldn't happen with commercial software? Look at your prized watcom
compiler.

There is so much room for creativity that I don't really see that the
GPL is all that much of a hindrance to making money. This is
_America_, George.

>>>No, I can see where OS *might* be useful when the algorithms & methods used
>>>for a particular sub-system are commonly known and all that's needed is
>>>"yet another" version of the same old widget. Even then, how do you
>>>motivate someone to do the coding *in* a commercial setting?... IOW not
>>>some student or graduate who wants to impress?
>>>
>>Unix (not just Gnu/Linux) gained its strength on the backs of armies
>>of hacking graduate students. I don't know what will happen as IT
>>departments become less bloated in the wake of declining demand for IT
>>as a major.
>
>Ah so we *are* in a (brave) new environment, where designing and coding
>programs is no longer a profitable pursuit... unless you have a novel
>algorithmic twist?
>
I think having an identified target market with money is more
important than having a novel algorithmic twist.

>>>><snip>
>
>>>>Well, but _why_? That's what we have yet to see. Only if it turns
>>>>out that you can give the user a completely different experience, or
>>>>if Apple and IBM can't come to terms on continuing the current
>>>>relationship.
>>>
>>>Why?... the usual quest for better & faster widgets to sell.
>>>
>>
>>At the price of having to rewrite everything?
>
>Ya mean like Itanium?:) I'd gotten the impression that the mundane stuff
>would just run on the PPC core and then... for newer creative stuff you
>could get more adventurous with the SPEs - no? IOW whatever fits in the
>porta-"C" category, and much of that is not performance-critical, just do
>it - the real bonus is in the rest.
>
I don't think so. The PowerPC part of Cell is really crippled
relative to a G5. You really have to be able to exploit the SPE's to
make Cell competitive, and I don't think any compiler anywhere is
going to compile c or c++ to effective Cell software because the
programming model is so different.

Instead of letting the PowerPC do actual work, you let it create a
thread and pass it off to an SPE. Then, if a SPE pipeline stalls on
the task, you don't care so much because it's only 1 of 16, whereas
the PPC has only two paths, both of them in-order.

The natural programming model is something like Kahn networks or
Synchronous Dataflow. Lots of work done, but applications would have
to be rewritten at the source code level.
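
[The offload model described above - a control core that only creates work and streams it to independent compute elements connected by FIFO channels - can be sketched in a few lines. This is purely an illustrative sketch of the Kahn-network idea using ordinary Python threads and queues; it is not the Cell SDK, and the squaring "kernel" is a stand-in for real SIMD work.]

```python
# Illustrative Kahn-network-style pipeline: a coordinator ("PPE") feeds a
# worker channel ("SPE") purely through FIFO queues. A stall in the worker
# idles only that channel, never the coordinator.
import queue
import threading

def worker(inbox: queue.Queue, outbox: queue.Queue) -> None:
    # Each "SPE" blocks on its input channel, transforms items, and
    # forwards results downstream.
    while True:
        item = inbox.get()
        if item is None:          # sentinel: shut down cleanly
            outbox.put(None)
            return
        outbox.put(item * item)   # stand-in for a real compute kernel

stage_in: queue.Queue = queue.Queue()
stage_out: queue.Queue = queue.Queue()
t = threading.Thread(target=worker, args=(stage_in, stage_out))
t.start()

# The coordinator only produces work and collects results.
for x in range(4):
    stage_in.put(x)
stage_in.put(None)

results = []
while (r := stage_out.get()) is not None:
    results.append(r)
t.join()
print(results)   # [0, 1, 4, 9]
```

[Note how nothing here looks like a loop over shared arrays: each stage sees only its channels, which is exactly why existing C/C++ sources would need restructuring at the source level to fit this model.]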

>>Cell looks to me like the realization of many hardware fantasies, and
>>a pretty slick one at that. Now what do we do with it?
>>
>>I mean, I can think of *lots* of things to do with Cell. I just don't
>>know how many of them are going to get done in a way that will have
>>any kind of market impact. Cell looks like a natural dataflow
>>processor to me, but how many dataflow programmers are there out
>>there?
>>
>>In the mid-nineties, people (not just the email that started this
>>thread) would be saying that Cell would slay the twin dragons of
>>Wintel. People would be fantasizing about who was going to make how
>>much money doing it. Gates/Ballmer would be whipping the staff into a
>>hysterical frenzy, and Microsoft would be announcing unbelievable
>>vaporware. I guess the champagne bottle has just been sitting open
>>for too long.
>
>After Alpha, and err, Itanium, plus MIPs & Risc-6K/Power in the Windows
>arena, it gets harder to get excited.... sobriety?:)
>
But I'm not sure it isn't going to happen this time. We _are_ moving
from single-processor to multi-processor execution. That train is
leaving the station, with or without Cell. Now that I've seen Cell,
though, I really like the possibilities.

RM
 

keith

Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

On Tue, 29 Mar 2005 15:58:55 -0500, George Macdonald wrote:

> On Mon, 28 Mar 2005 22:28:26 -0500, keith <krw@att.bizzzz> wrote:
>
>>On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald wrote:
>>
>>> On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <rmyers1400@comcast.net>
>>> wrote:
>>
>>>>You've obviously been reading the output of the Alexis de Tocqueville
>>>>Institute. I wonder how much code is really written that way. Open
>>>>Source has been awfully professionalized.
>>>
>>> No, it's a fairly regularly mentioned scenario to describe how OS works -
>>> do the search yourself. How can you say something is "professionalized"
>>> when the program design and coding has to be given away?
>>>
>>>>There are so many different business models: The money is in _______.
>>>>
>>>>(a) Hardware.
>>>>(b) Software.
>>>>(c) Services.
>>>>(d) Content.
>>>>
>>>>Give away whatever isn't the source of revenue to tap into whatever
>>>>is. Or, as in the case of open source software, use controversial dual
>>>>licensing to give away software to establish it as a standard so you
>>>>can sell it.
>>>
>>> So the (b) above is not a source of revenue any longer then!
>>
>>For some models, no. For others, certainly. It's a matter of what *you*
>>>decide are your razors and blades.
>
> Whatever is err, patentable?;-)

You forget that IBM turned over 500ish patents to the open-software
community. You're not looking beyond the razors. You've just flunked
Gillette marketing 101. ;-)

>>> So let's say
>>> I come up with a novel, revolutionary algorithm, e.g. practical solver
>>> for the traveling salesman problem with true optimal solutions; I then
>>> design the method for implementation and code it all up. Now I'm
>>> supposed to give it away because it uses libraries which are OS?
>>
>>There is no requirement to do this. You can keep *your* code private. If
>>that's what you're selling, it even makes sense. ;-)
>
> I'd rather pay for the OS, compiler and libraries and compete, unfettered
> by GPL-like impositions, on an even field.

You are not "fettered" by having used GPL tools. You may indeed sell your
tools as OCO. IIRC, you may not package that code as part of yours. I'm
not a frappin' programmer <spit>, but that's my understanding.

Your understanding of employer relationships is a little out of date too.
Many are encouraged to participate in OSS, within obvious conflict of
interest barriers, obviously.

>>>> No, I can see where OS *might* be useful when the algorithms & methods
>>> used for a particular sub-system are commonly known and all that's
>>> needed is "yet another" version of the same old widget. Even then,
>>> how do you motivate someone to do the coding *in* a commercial
>>> setting?... IOW not some student or graduate who wants to impress?
>>
>>OS <> Applications <> algorithms.
>
> Of course, but there are obvious inter-dependencies. It also depends
> what is meant by an OS, which is generally assumed to include a certain
> repertoire of utility "apps". There are algorithms within algorithms
> and nothing "works" without them. BTW I am vehemently opposed to
> patenting of algorithms - we've seen enough damage there.

I'm not sure I agree. I'm not sure I understand the difference between an
algorithm and a process. Ok, I do work in the patent arena, but I do shy
away from anything with software in it. Processes aren't software though,
but it could easily be argued that they are algorithms. I'm not smart
enough to know the difference. You?

--
Keith
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

"Robert Myers" <rmyers1400@comcast.net> wrote in message
news:u8ij4151admhcsu1bnqdhuj1pkgt9355cq@4ax.com...
>
snip
> >
> One of the very few things Edward Teller said that I agreed with was
> that the things that really make a difference in national security
> don't need to be classified because you can't write down, transmit, or
> easily steal the secrets, anyway. The prizes of World War II were the
> actual rocket scientists, not their blueprints or even prototypes.

And after while you could buy atom bomb kits in Pakistani supermarkets
under the AQ Khan brand.
Get a few grad students to put them together. Still as dangerous as
they were back when you needed exotic scientists.

Or maybe that cute little suitcase size nuke, the W31 as I recall, that
the Chinese ended up cloning.

I think if Teller really said that he was mistaken.

It took Shockley et al to make the first transistor. Not any more.
It took a genius at IBM to make the first high temp superconductor. Now
High School kids can make them.

If I had the secret formula I could make Coke. I wouldn't need exotic
training or skills.

snip

> I don't think so. The PowerPC part of Cell is really crippled
> relative to a G5. You really have to be able to exploit the SPE's to
> make Cell competitive, and I don't think any compiler anywhere is
> going to compile c or c++ to effective Cell software because the
> programming model is so different.
>
> Instead of letting the PowerPC do actual work, you let it create a
> thread and pass it off to an SPE. Then, if a SPE pipeline stalls on
> the task, you don't care so much because it's only 1 of 16, whereas
> the PPC has only two paths, both of them in-order.
>
> The natural programming model is something like Kahn networks or
> Synchronous Dataflow. Lots of work done, but applications would have
> to be rewritten at the source code level.
>
snip
> But I'm not sure it isn't going to happen this time. We _are_ moving
> from single-processor to multi-processor execution. That train is
> leaving the station, with or without Cell. Now that I've seen Cell,
> though, I really like the possibilities.
>
> RM

I would say that there are folks, perhaps the ones at Sony, who think
that in the long run or maybe even the medium run that wintel will go
the way of the dinosaur or maybe the vector supercomputer. :)

del
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

On Wed, 30 Mar 2005 02:57:14 GMT, "Delbert Cecchi"
<dcecchi_nospam@worldnet.att.net> wrote:

>
>"Robert Myers" <rmyers1400@comcast.net> wrote in message
>news:u8ij4151admhcsu1bnqdhuj1pkgt9355cq@4ax.com...
>>
>snip
>> >
>> One of the very few things Edward Teller said that I agreed with was
>> that the things that really make a difference in national security
>> don't need to be classified because you can't write down, transmit, or
>> easily steal the secrets, anyway. The prizes of World War II were the
>> actual rocket scientists, not their blueprints or even prototypes.
>
>And after while you could buy atom bomb kits in Pakistani supermarkets
>under the AQ Khan brand.
>Get a few grad students to put them together. Still as dangerous as
>they were back when you needed exotic scientists.
>
>Or maybe that cute little suitcase size nuke, the W31 as I recall, that
>the Chinese ended up cloning.
>
>I think if Teller really said that he was mistaken.
>
>It took Shockley et al to make the first transistor. Not any more.
>It took a genius at IBM to make the first high temp superconductor. Now
>High School kids can make them.
>
>If I had the secret formula I could make Coke. I wouldn't need exotic
>training or skills.
>
So, as we have discovered, if one country does the proof of principle,
and only the vaguest outlines of how it's done can be discovered, a
determined adversary can often duplicate the results, even under very
challenging circumstances. Keeping things secret doesn't do much
good.

An example of what Teller was talking about (and I can't find the
exact quote, but you can easily find quotes of him advocating drastic
changes to the country's secrecy policies) was the inadvertent
shipment of machines to make precision ball bearings to the Soviet
Union at the height of the cold war. That slip allowed them to MIRV
their warheads, a major escalation of the arms race. The Soviets
didn't know how to make ball bearings? Apparently not.

>snip
>
>> I don't think so. The PowerPC part of Cell is really crippled
>> relative to a G5. You really have to be able to exploit the SPE's to
>> make Cell competitive, and I don't think any compiler anywhere is
>> going to compile c or c++ to effective Cell software because the
>> programming model is so different.
>>
>> Instead of letting the PowerPC do actual work, you let it create a
>> thread and pass it off to an SPE. Then, if a SPE pipeline stalls on
>> the task, you don't care so much because it's only 1 of 16, whereas
>> the PPC has only two paths, both of them in-order.
>>
>> The natural programming model is something like Kahn networks or
>> Synchronous Dataflow. Lots of work done, but applications would have
>> to be rewritten at the source code level.
>>
>snip
>> But I'm not sure it isn't going to happen this time. We _are_ moving
>> from single-processor to multi-processor execution. That train is
>> leaving the station, with or without Cell. Now that I've seen Cell,
>> though, I really like the possibilities.
>>
>
>I would say that there are folks, perhaps the ones at Sony, who think
>that in the long run or maybe even the medium run that wintel will go
>the way of the dinosaur or maybe the vector supercomputer. :)
>
Cell has both the interconnect bandwidth and the execution paths to
make a worthy successor to vector supercomputers.

As to the actual prospects? Who wouldn't be cautious at this point?
The age imbalance (with some exceptions :) ) in who is showing
interest and excitement and who is huffily standoffish is striking.

RM
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

"Robert Myers" <rmyers1400@comcast.net> wrote in message
news:dohk41hpi69sd65m6tdklt6nbv6kq43v5s@4ax.com...
>
> An example of what Teller was talking about (and I can't find the
> exact quote, but you can easily find quotes of him advocating drastic
> changes to the country's secrecy policies) was the inadvertent
> shipment of machines to make precision ball bearings to the Soviet
> Union at the height of the cold war. That slip allowed them to MIRV
> their warheads, a major escalation of the arms race. The Soviets
> didn't know how to make ball bearings? Apparently not.

Bob, you're apparently under the impression that the gyroscopes at the
heart of the inertial guidance packages used to direct ICBM warheads
used ball bearings. That ain't so. The gyros used gas bearings;
specifically, nitrogen gas since the presence of oxygen would
gradually change the delicate balance over time. Small vanes on the
rotating part assured that there was _no_ metal-to-metal contact.

These gas-bearing based guidance packages were developed by MIT
initially under the guidance of prof. Charles Stark Draper. Later,
the Charles Stark Draper Lab, operating under MIT's roof, carried on
this work - even when they had to move the Lab to Florida because of
all the peaceniks in Cambridge MA during the latter stages of the
Vietnam war.

I was intimately involved with this stuff at a first-tier
subcontractor during the 60's and most of the 70's.

Are you possibly confusing the inadvertent (supposedly) shipment of
Japanese multi-axis milling equipment to the Soviet Union, making it
possible for the Soviets to produce very quiet propellers for their
submarines? I think the company involved was Toshiba.
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

On Wed, 30 Mar 2005 07:29:38 GMT, "Felger Carbon" <fmsfnf@jfoops.net>
wrote:

>"Robert Myers" <rmyers1400@comcast.net> wrote in message
>news:dohk41hpi69sd65m6tdklt6nbv6kq43v5s@4ax.com...
>>
>> An example of what Teller was talking about (and I can't find the
>> exact quote, but you can easily find quotes of him advocating drastic
>> changes to the country's secrecy policies) was the inadvertent
>> shipment of machines to make precision ball bearings to the Soviet
>> Union at the height of the cold war. That slip allowed them to MIRV
>> their warheads, a major escalation of the arms race. The Soviets
>> didn't know how to make ball bearings? Apparently not.
>
>Bob, you're apparently under the impression that the gyroscopes at the
>heart of the inertial guidance packages used to direct ICBM warheads
>used ball bearings. That ain't so. The gyros used gas bearings;
>specifically, nitrogen gas since the presence of oxygen would
>gradually change the delicate balance over time. Small vanes on the
>rotating part assured that there was _no_ metal-to-metal contact.
>
>These gas-bearing based guidance packages were developed by MIT
>initially under the guidance of prof. Charles Stark Draper. Later,
>the Charles Stark Draper Lab, operating under MIT's roof, carried on
>this work - even when they had to move the Lab to Florida because of
>all the peaceniks in Cambridge MA during the latter stages of the
>Vietnam war.
>
>I was intimately involved with this stuff at a first-tier
>subcontractor during the 60's and most of the 70's.
>
>Are you possibly confusing the inadvertent (supposedly) shipment of
>Japanese multi-axis milling equipment to the Soviet Union, making it
>possible for the Soviets to produce very quiet propellers for their
>submarines? I think the company involved was Toshiba.

Well, no, at least not as far as the functioning of my memory and
understanding goes. I remember the submarine propeller incident,
which involved export by a Japanese company, not an American company,
as you stated. I can't find a respectable reference on the web to the
Bryant Chucking Grinder Company episode, but here is a reference to a
respectable reference:

http://www.nwowatcher.com/ebooks/The%20Best%20Enemy%20Money%20Can%20Buy%20-%20By%20Antony%20Sutton.pdf

<quote>

Perhaps the best-informed American scholar in the field of Soviet
history and overall strategy is Prof. Richard Pipes of Harvard
University. In 1984, his chilling book appeared, Survival Is Not
Enough: Soviet Realities and America's Future (Simon &
Schuster). His book tells at least part of the story of the Soviet
Union's reliance on Western technology, including the infamous Kama
River truck plant, which was built by the Pullman-Swindell company of
Pittsburgh, Pennsylvania, a subsidiary of M. W. Kellogg Co.
Prof. Pipes remarks that the bulk of the Soviet merchant marine, the
largest in the world, was built in foreign shipyards. He even tells
the story (related in greater detail in this book) of the Bryant
Chucking Grinder Company of Springfield, Vermont, which sold the
Soviet Union the ball-bearing machines that alone made possible the
targeting mechanism of Soviet MIRV'ed ballistic missiles.

</quote>

The ball bearings part of the story never seemed particularly
plausible to me, but that was the story as it was reported. It may
well have been a cover.

RM
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

George Macdonald wrote:

> So let's say
> I come up with a novel, revolutionary algorithm, e.g. practical solver for
> the traveling salesman problem with true optimal solutions; I then design
> the method for implementation and code it all up. Now I'm supposed to give
> it away because it uses libraries which are OS?

If you don't like it, don't use the open-source libraries. Why should
*you* get to profit from the work of the people who wrote those
libraries? You used their ideas and their work, for free; why shouldn't
they get to use your ideas and your work, for free?

--
Mike Smith
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

On Tue, 29 Mar 2005 16:59:31 -0500, Robert Myers <rmyers1400@comcast.net>
wrote:

>On Tue, 29 Mar 2005 15:58:54 -0500, George Macdonald
><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>
>>On Mon, 28 Mar 2005 15:59:35 -0500, Robert Myers <rmyers1400@comcast.net>
>>wrote:
>>
>>>On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald
>>><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>>>
>>>>On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <rmyers1400@comcast.net>
>>>>wrote:
>>>>
>>
>>>>>You've obviously been reading the output of the Alexis de Tocqueville
>>>>>Institute. I wonder how much code is really written that way. Open
>>>>>Source has been awfully professionalized.
>>>>
>>>>No, it's a fairly regularly mentioned scenario to describe how OS works -
>>>>do the search yourself. How can you say something is "professionalized"
>>>>when the program design and coding has to be given away?
>>>>
>>>>>There are so many different business models: The money is in _______.
>>>>>
>>>>>(a) Hardware.
>>>>>(b) Software.
>>>>>(c) Services.
>>>>>(d) Content.
>>>>>
>>>>>Give away whatever isn't the source of revenue to tap into whatever
>>>>>is. Or, as in the case of open source software, use controversial dual
>>>>>licensing to give away software to establish it as a standard so you
>>>>>can sell it.
>>>>
>>>>So the (b) above is not a source of revenue any longer then!
>>>>
>>>RedHat certainly thinks (b) can be a source of revenue, but Wall
>>>Street seems increasingly skeptical:
>>
>>I thought we were talking about earning $$ from developing software -
>>paying analysts and programmers to design and write code. When you pay
>>RedHat it's to cover all the ancillaries, like advertising, packaging
>>admin. etc. plus AIUI, some form of support.
>>
>>>http://www.forbes.com/markets/2005/03/24/cx_el_0324weekmarkets.html
>>>
>>><quote>
>>>
>>>Red Hat (nasdaq: RHAT - news - people ) will report fiscal
>>>fourth-quarter earnings on Thursday. The Street is expecting earnings
>>>of 6 cents per share on revenue of $56 million. Earlier this month
>>>Standard & Poor's Equity Research downgraded to "sell" from "hold" and
>>>cut the target price, citing expectations for further pricing pressure
>>>for Linux software and services, which "could negatively impact
>>>shares" in the near term.
>>>
>>></quote>
>>>
>>>>>So let's say
>>>>>I come up with a novel, revolutionary algorithm, e.g. practical solver for
>>>>>the traveling salesman problem with true optimal solutions; I then design
>>>>>the method for implementation and code it all up. Now I'm supposed to give
>>>>>it away because it uses libraries which are OS?
>>>
>>>Highly-specialized software is staying closed source mostly, isn't it?
>>
>>There's a huge (dynamic) fuzzy area there - today's technology is
>>tomorrow's commodity of course but maybe you're right: software is about to
>>enter a new era where it leaves behind the whoring-model... "ya got it, ya
>>sell it... ya still got it".;-)
>>
>One of the very few things Edward Teller said that I agreed with was
>that the things that really make a difference in national security
>don't need to be classified because you can't write down, transmit, or
>easily steal the secrets, anyway. The prizes of World War II were the
>actual rocket scientists, not their blueprints or even prototypes.
>
>Players more or less _have_ to contribute to these communal efforts,
>and their assets are the people who really understand what's going on.
>Take your eye off the ball for a short period, and you're quickly out
>of the game.

Harrumph - "join the clique or wither" - lost bodies and squandered
opportunities. There are any number of important works which have been
developed in near-seclusion. Mediocrity loves "peers" and their
self-regarding committees.

>You don't want RedHat's actual packaged software? No problem. But if
>it breaks, you're on your own or at the mercy of community resources.
>That's neither free software nor commercial software, but RedHat _is_
>making money off software.

It is not *creating* *anything* - sorry but I don't see charging for
packages as making $$ from software.

>From an end user's point of view, I don't know that the biggest
>concern works much differently either way. Unless your favorite
>software is kept up to date so that it can live happily with the
>latest kernel, you could be out of luck. Have it happen to you just
>once, spend some time digging through mail lists trying to figure out
>how the kernel headers changed, and you realize what a problem it is.
>Wouldn't happen with commercial software? Look at your prized watcom
>compiler.

Now, now... I have used Watcom's compilers and have not said in any terms
that I prized them, other than that they exist (existed commercially) and
are/were another alternative... in fact a very valuable one when M$ didn't
have the goods, less so now. Oh and Watcom perished because of business
mistakes by a greedy Sybase - it sank along with the rest of Sybase.

>There is so much room for creativity that I don't really see that the
>GPL is all that much of a hindrance to making money. This is
>_America_, George.
>
>>>>No, I can see where OS *might* be useful when the algorithms & methods used
>>>>for a particular sub-system are commonly known and all that's needed is
>>>>"yet another" version of the same old widget. Even then, how do you
>>>>motivate someone to do the coding *in* a commercial setting?... IOW not
>>>>some student or graduate who wants to impress?
>>>>
>>>Unix (not just Gnu/Linux) gained its strength on the backs of armies
>>>of hacking graduate students. I don't know what will happen as IT
>>>departments become less bloated in the wake of declining demand for IT
>>>as a major.
>>
>>Ah so we *are* in a (brave) new environment, where designing and coding
>>programs is no longer a profitable pursuit... unless you have a novel
>>algorithmic twist?
>>
>I think having an identified target market with money is more
>important than having a novel algorithmic twist.

More important for what - either you're being obtuse or missing the point.
What I'm getting at is the survival, or not, of the sort of company which
employs analysts/programmers who design and write software and try to make
a living from that endeavour.

>>>>><snip>
>>
>>>>>Well, but _why_? That's what we have yet to see. Only if it turns
>>>>>out that you can give the user a completely different experience, or
>>>>>if Apple and IBM can't come to terms on continuing the current
>>>>>relationship.
>>>>
>>>>Why?... the usual quest for better & faster widgets to sell.
>>>>
>>>
>>>At the price of having to rewrite everything?
>>
>>Ya mean like Itanium?:) I'd gotten the impression that the mundane stuff
>>would just run on the PPC core and then... for newer creative stuff you
>>could get more adventurous with the SPEs - no? IOW whatever fits in the
>>porta-"C" category, and much of that is not performance-critical, just do
>>it - the real bonus is in the rest.
>>
>I don't think so. The PowerPC part of Cell is really crippled
>relative to a G5. You really have to be able to exploit the SPE's to
>make Cell competitive, and I don't think any compiler anywhere is
>going to compile c or c++ to effective Cell software because the
>programming model is so different.
>
>Instead of letting the PowerPC do actual work, you let it create a
>thread and pass it off to an SPE. Then, if a SPE pipeline stalls on
>the task, you don't care so much because it's only 1 of 16, whereas
>the PPC has only two paths, both of them in-order.
>
>The natural programming model is something like Kahn networks or
>Synchronous Dataflow. Lots of work done, but applications would have
>to be rewritten at the source code level.

What I'm saying is that for the bulk of installed, hum-drum software on a
PC/workstation, the performance just doesn't matter that much.

--
Rgds, George Macdonald
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

On Tue, 29 Mar 2005 20:34:04 -0500, keith <krw@att.bizzzz> wrote:

>On Tue, 29 Mar 2005 15:58:55 -0500, George Macdonald wrote:
>
>> On Mon, 28 Mar 2005 22:28:26 -0500, keith <krw@att.bizzzz> wrote:
>>
>>>On Mon, 28 Mar 2005 14:25:54 -0500, George Macdonald wrote:
>>>
>>>> On Sun, 27 Mar 2005 07:36:52 -0500, Robert Myers <rmyers1400@comcast.net>
>>>> wrote:
>>>
>>>>>You've obviously been reading the output of the Alexis de Tocqueville
>>>>>Institute. I wonder how much code is really written that way. Open
>>>>>Source has been awfully professionalized.
>>>>
>>>> No, it's a fairly regularly mentioned scenario to describe how OS works -
>>>> do the search yourself. How can you say something is "professionalized"
>>>> when the program design and coding has to be given away?
>>>>
>>>>>There are so many different business models: The money is in _______.
>>>>>
>>>>>(a) Hardware.
>>>>>(b) Software.
>>>>>(c) Services.
>>>>>(d) Content.
>>>>>
>>>>>Give away whatever isn't the source of revenue to tap into whatever
>>>>>is. Or, as in the case of open source software, use controversial dual
>>>>>licensing to give away software to establish it as a standard so you
>>>>>can sell it.
>>>>
>>>> So the (b) above is not a source of revenue any longer then!
>>>
>>>For some models, no. For others, certainly. It's a matter of what *you*
>>>decide are your razors and blades.
>>
>> Whatever is err, patentable?;-)
>
>You forget that IBM turned over 500ish patents to the open-software
>community. You're not looking beyond the razors. You've just flunked
>Gillette marketing 101. ;-)

No I didn't forget - I didn't know in the first place.:) If they were
software patents then I'm glad they did that because they should never have
been awarded in the first place IMO. That *is* the world we are supposed
to live in now I guess, with the EC[ptui] looking like forcing through
approval of this iniquity as well (their parliament is being brushed aside
by the EC[ptui] crypto-fascists), but that doesn't make it right. Just
wait till the Chinese get themselves organized under such a framework.

>>>> So let's say
>>>> I come up with a novel, revolutionary algorithm, e.g. practical solver
>>>> for the traveling salesman problem with true optimal solutions; I then
>>>> design the method for implementation and code it all up. Now I'm
>>>> supposed to give it away because it uses libraries which are OS?
>>>
>>>There is no requirement to do this. You can keep *your* code private. If
>>>that's what you're selling, it even makes sense. ;-)
>>
>> I'd rather pay for the OS, compiler and libraries and compete, unfettered
>> by GPL-like impositions, on an even field.
>
>You are not "fettered" by having used GPL tools. You may indeed sell your
>tools as OCO. IIRC, you may not package that code as part of yours. I'm
>not a frappin' programmer <spit>, but that's my understanding.

As you well know, with any high level language it's impossible to
distribute software without its library content. Anything which might
currently allow that, on a limited basis, is just another rule, which is up
for change on the whim of whoever holds the reins today.

>Your understanding of employer relationships is a little out of date too.
>Many are encouraged to participate in OSS, within obvious conflict of
>interest barriers, obviously.

Things may be different where you are. FWIS, if anything, employer
restrictions on outside and post-employment activities are getting more
onerous and broader in their coverage.

>>>> No, I can see where OS *might* be useful when the algorithms & methods
>>>> used for a particular sub-system are commonly known and all that's
>>>> needed is "yet another" version of the same old widget. Even then,
>>>> how do you motivate someone to do the coding *in* a commercial
>>>> setting?... IOW not some student or graduate who wants to impress?
>>>
>>>OS <> Applications <> algorithms.
>>
>> Of course, but there are obvious inter-dependencies. It also depends
>> what is meant by an OS, which is generally assumed to include a certain
>> repertoire of utility "apps". There are algorithms within algorithms
>> and nothing "works" without them. BTW I am vehemently opposed to
>> patenting of algorithms - we've seen enough damage there.
>
>I'm not sure I agree. I'm not sure I understand the difference between an
>algorithm and a process. Ok, I do work in the patent arena, but I do shy
>away from anything with software in it. Processes aren't software though,
>but it could easily be argued that they are algorithms. I'm not smart
>enough to know the difference. You?

Agree on what?... the patenting of algorithms? It's only in the past 20
years or so that algorithms have been patentable - prior to that they were
classed as an idea which is/was(?) not patentable; protection is/was
available under copyright of the expression of the idea. Not sure how that
sits vs. hardware processes but some differences are obvious... at least
under the old rules.

It's difficult to go into such things in a public forum but I was somewhat
peripherally involved in an early algorithm patent err, quarrel; this thing
was hailed on national news as a "mathematical breakthrough", though it was
really only a twist on well known published methods. The abuse was glaring
and inequitable - the only ones (large corps) who had the clout to do
anything about it had a broad cross-license agreement with the (large corp)
originator of the patent, so didn't care. The little guys got
"penetrated"... even though their implementation of a modified version of
the algorithm blew the big guy's away.

We now have the (resulting) situation where hardly anything of note gets
published anymore, as universities rush to the patent office to exact their
pound of flesh. Apart from any legal ramifications, the previous situation
was healthier and much more apt to produce real innovation, from my POV.
The only ones who benefit from the status quo are the usual shysters.

--
Rgds, George Macdonald