Intel drops HyperThreading

Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

On Mon, 29 Aug 2005 16:28:07 -0700, "David Schwartz"
<davids@webmaster.com> wrote:


> Unless you need resolution over 1280x1024 or need a ridiculously large
>viewing angle, there are LCDs that serve perfectly for both graphics editing
>and games. For example, the NEC 2010X is totally suitable to both
>applications.
>
> DS
>

What if I need to run games at lower resolutions, what if I like my
monitor to actually be capable of showing subtle gradation in tones,
what if I prefer superior colour accuracy?
 

In comp.sys.ibm.pc.hardware.chips Praxiteles Democritus <no@email.here> wrote:
> The usual stupid assumptions posted by clueless tossers.

That's more than a little impolite, and counterproductive
if you actually wanted to convince someone.

> There are a few reasons why CRT is superior to LCD in image quality.

There are, but you leave them unstated, and one gets
the impression that you don't know them. I usually
equate impoliteness with ignorance.

As I understand it, many gamers still prefer CRT over LCD:

1) CRT phosphors have lower persistence than LCDs, producing
less afterimage during motion ("ghosting")

2) LCD pixels are extremely sharp. This is great for text,
but unpleasant for images. The slight blur of CRTs mimics
natural vision and avoids hyperpixelation.

There has been considerable improvement in (1), but (2)
still operates. For a simple demonstration, try watching
a DVD on an LCD vs CRT.

-- Robert
 

On Mon, 29 Aug 2005 16:28:07 -0700, "David Schwartz" <davids@webmaster.com>
wrote:

>
>"George Macdonald" <fammacd=!SPAM^nothanks@tellurian.com> wrote in message
>news:nc37h19r7k9t14a8mpm60227i6bjljj81j@4ax.com...
>
>> Maybe he's into photography and games, in which case there is no single
>> LCD
>> which will do the job for him. With LCDs, you choose your technology, be
>> it TN/film, xVA or IPS for the job it does with one of your tasks and live
>> with any/all compromises elsewhere. Every time it looks like the latest
>> twist or tweak is going to be the universal fix, like S-IPS, there's
>> always
>> something which annoys with it.
>
> Unless you need resolution over 1280x1024 or need a ridiculously large
>viewing angle, there are LCDs that serve perfectly for both graphics editing
>and games. For example, the NEC 2010X is totally suitable to both
>applications.

Resolution has nothing to do with it - in this case it's color rendition
vs. response time, and I said photography, not graphics editing. I'm afraid
you'll need more than a model number to convince me, especially since
NEC doesn't have a listing for this thing in their current product
line, nor do they list important specs for any of their monitors, nor say
which technology they are using for any given screen.

There are also other problems, such as full-motion video artifacts, with
the very recent (expensive) LCDs which claim to be suitable for
photography and to have near-sufficient response for games. There's a reason
that Eizo-Nanao charges an arm and a leg for a screen with ~40ms response time.

--
Rgds, George Macdonald
 

On 29 Aug 2005 16:58:42 -0700, "Robert Myers" <rbmyersusa@gmail.com> wrote:

>
>George Macdonald wrote:
>> On 29 Aug 2005 07:33:07 -0700, "Robert Myers" <rbmyersusa@gmail.com> wrote:
>>
>> >George Macdonald wrote:
>> >> On 27 Aug 2005 06:36:43 -0700, "Robert Myers" <rbmyersusa@gmail.com> wrote:
>> >>
>> >> >George Macdonald wrote:
>> >> >> On 26 Aug 2005 07:37:06 -0700, "Robert Myers" <rbmyersusa@gmail.com> wrote:
>> >> >>
>>
>> >> >No matter what power management trickery does for you most of the time,
>> >> >you've got to be able to cool the thing when it's operating at peak
>> >> >performance.
>> >>
>> >> Well we know that Intel stubbed its toes there at 4GHz and while the end of
>> >> scaling seems to be accepted as imminent, it's not clear how far other mfrs
>> >> can go, nor in what time scale. What I'm talking about is also more than
>> >> what we normally think of as power management - more like distributed
>> >> dynamic adaptive clocks - there may be a better term for that. 100% load
>> >> is difficult to categorize there and of course "clock rate" becomes
>> >> meaningless as a performance indicator.
>> >>
>> >> AMD has said that it intends to continue to push clock speeds on single
>> >> core CPUs and its current offerings do not suffer anywhere near the same
>> >> heat stress as Intel's even at "100% load"; if AMD can get to 4GHz, and
>> >> maybe a bit beyond with 65nm, they are quite well positioned. All I'm
>> >> saying is that I'm not ready to swallow all of Intel's latest market-speak
>> >> on performance/watt as a new metric for measuring CPU effectiveness. They
>> >> certainly tried to get too far up the slippery slope too quickly - it still
>> >> remains to be seen where the real limits are and which technology makes a
>> >> difference.
>> >>
>> >Let's not get into another Intel/AMD round. As it stands now, Intel is
>> >likely to put its efforts at pushing single thread performance into
>> >Itanium. Who knows how long that emphasis will last.
>>
>> It was an honest and *correct* comment on Intel's technology choices - no
>> need to have any "round" of anything... and certainly not about Itanium.
>>
>Translation: Intel isn't likely to want to play. That may have no
>bearing on AMD's decision-making whatsoever, but, if AMD wants to go
>after x86 users with need for single-thread performance, I suspect they
>will have the market all to themselves. The gamers who have
>historically carried users hungry for single-threaded performance will
>all have moved to multi-core machines because that's where they'll be
>getting the best performance because all of the software will have been
>rewritten for multiple cores. IBM will stay in the game because IBM
>wants to keep Power ahead of Itanium on SpecFP, and the x86 chips
>you'll be looking to buy, if they're available, will be priced like
>Power5, or whatever it is by then. You know, that monopoly thing.

Hmm, and you said you didn't want to get into another "Intel/AMD"
round... and yet, there you go again. I was only stating a documented,
acknowledged fact - your prognostications are not relevant.

The game makers have already stated that they don't expect to get much out
of multi-core - it looks to me like a single high-speed core is what is needed
there for a (long) while yet. Hell, dual CPUs have been available for long
enough and they have not piqued any gaming interest.

>> >I don't think the market is going to be there to pay for the kind of
>> >workhorse you say you need. Power consumption is a huge economic
>> >consideration in HPC: as it stands now, it doesn't pay to run clusters
>> >more than about 3 years because it is more expensive to pay for the
>> >power to continue running them than it is to pay for replacements that
>> >save power.
>>
>> I don't need a super-computer - I want the fastest PC I can get for a
>> price-point which is usually a notch back from the very highest
>> clock-speed, not some compromised thing which fits a marketing strategy for
>> peoples' living rooms.
>>
>
>My explanation as to why the other usual customers for single-threaded
>performance won't be there in large numbers, either.

So far I haven't seen an "explanation" for anything here... other than a
blind willingness to follow the latest Intel marketing angle.

>> >> A different programming paradigm/style is not going to help them -
>> >> expectation of success is not obviously better than that of new
>> >> semiconductor tweaks, or even technology, which allows another 100x speed
>> >> ramp over 10 years or so coming along. When I hear talk of new compiler
>> >> technology to assist here, I'm naturally skeptical, based on past
>> >> experiences.
>> >
>> >Well sure. The compiler first has to reverse engineer the control and
>> >dataflow graph that's been obscured by the programmer and the
>> >sequential language with bolted-on parallelism that was used. If you
>> >could identify the critical path, you'd know what to do, but, even for
>> >very repetitive calculations, the critical path that is optimized is at
>> >best a guess.
>>
>> It's not static - compilers don't have the right info for the job... and
>> compiler-compilers won't do it either.
>>
>That's why you do profiling.

It makes me wonder sometimes when you spout some buzzword like that as
though it is known to work well for all general purpose code working on all
possible data sets. <shrug> People who use compilers know this.

--
Rgds, George Macdonald
 

On Tue, 30 Aug 2005 16:00:54 GMT, Praxiteles Democritus <no@email.here>
wrote:

>On Mon, 29 Aug 2005 19:15:35 -0400, George Macdonald
><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>
>
>>Maybe he's into photography and games,
>
>Correct, I'm into both and crt is superior in both instances.

Maybe you can explain to him how his NEC 2010X works in games :-) - the only
mention I've seen of response time quotes "60ms typical".

--
Rgds, George Macdonald
 

"George Macdonald" <fammacd=!SPAM^nothanks@tellurian.com> wrote in message
news:06p9h1dlp0cuii0vmtatkuieno09gui76m@4ax.com...

> On Tue, 30 Aug 2005 16:00:54 GMT, Praxiteles Democritus <no@email.here>
> wrote:

>>On Mon, 29 Aug 2005 19:15:35 -0400, George Macdonald
>><fammacd=!SPAM^nothanks@tellurian.com> wrote:

>>>Maybe he's into photography and games,
>>
>>Correct, I'm into both and crt is superior in both instances.

> Maybe you can explain to him how his NEC 2010X works in games :-) - the only
> mention I've seen of response time quotes "60ms typical".

It works really, really well. It totally blows away the Compaq P920
sitting next to it.

DS
 

On Tue, 30 Aug 2005 12:52:36 GMT, CJT <abujlehc@prodigy.net> wrote:

>Trent wrote:
>> Tsk, tsk. Take your spankings over PC power consumption like a man,
>> puddles. For your sake, I hope that you have a better grasp of your crank
>> than you do of computers.
>
>Hey! Intel and Apple agree that power consumption of PCs is a problem,
>so I'm in good company.

You're in good company for all the wrong reasons. Intel and Apple
agree that the power density of processors is getting excessive, it's
just getting too hard to get the power into the chips and getting the
power out of the chips. They also agree that power consumption has to
be kept down on laptops where battery life matters.

Neither of them are making any sort of arguments about processor power
consumption having a noticeable effect on the electrical grid because
such an argument is nonsense.

-------------
Tony Hill
hilla <underscore> 20 <at> yahoo <dot> ca
 

On Tue, 30 Aug 2005 03:26:02 +0000, CJT wrote:

> keith wrote:
>
>> On Mon, 29 Aug 2005 18:13:19 -0700, David Schwartz wrote:
>>
>>
>>>"CJT" <abujlehc@prodigy.net> wrote in message
>>>news:4313A54A.8020300@prodigy.net...
>>>
>>>
>>>>David Schwartz wrote:
>>>
>>>>> Unless you need resolution over 1280x1024 or need a ridiculously
>>>>>large viewing angle, there are LCDs that serve perfectly for both
>>>>>graphics editing and games. For example, the NEC 2010X is totally
>>>>>suitable to both applications.
>>>
>>>>1280x1024 isn't exactly hires any more.
>>>
>>> There are very few games that support resolutions above that. For normal
>>>desktop work, 1280x1024 is more than adequate. Personally, I prefer to have
>>>two LCD monitors, each 1280x1024, using the second one only when
>>>circumstances require it.
>>
>>
>> As others here will attest, I've been using a 3200x1600 desktop at
>> work for almost five years. One display is the laptop's LCD, the other is
>> a 20" monitor. 1280x1024 is *NOT* adequate (though I live with two
>> 19" CRTs at this resolution, each, here at home).
>>
>>
>>> What percentage of PC computer users do you think have a resolution
>>> over 1280x1024?
>>
>>
>> What percentage have tried it? What percentage have ever gone back?
>> Sheesh, I still see people with 1024x768 on 20" monitors at 60Hz! Is that
>> what we should all aspire to? ...the least common denominator?
>>
> Yeah, you da man ... NOT!

I see you've now admitted that you've been talking out your ass. Thank
you.

--
Keith
 

On Tue, 30 Aug 2005 20:52:54 -0400, Tony Hill wrote:

> On Tue, 30 Aug 2005 12:52:36 GMT, CJT <abujlehc@prodigy.net> wrote:
>
>>Trent wrote:
>>> Tsk, tsk. Take your spankings over PC power consumption like a man,
>>> puddles. For your sake, I hope that you have a better grasp of your crank
>>> than you do of computers.
>>
>>Hey! Intel and Apple agree that power consumption of PCs is a problem,
>>so I'm in good company.
>
> You're in good company for all the wrong reasons. Intel and Apple
> agree that the power density of processors is getting excessive, it's
> just getting too hard to get the power into the chips and getting the
> power out of the chips. They also agree that power consumption has to
> be kept down on laptops where battery life matters.
>
> Neither of them are making any sort of arguments about processor power
> consumption having a noticeable effect on the electrical grid because
> such an argument is nonsense.

Exactly. Intel isn't trying to save the "national" power grid, rather
their marketing-restricted power supply (and sink). They can't compete in
the "unlimited class", so they are redefining the market. No doubt
mobiles are quite important and power density is a problem, but Intel's
spin is dizzying. ...again. ;-)

--
Keith
 

On Tue, 30 Aug 2005 15:46:55 +0000, Felger Carbon wrote:

> "chrisv" <chrisv@nospam.invalid> wrote in message
> news:b3q8h15k8nkrdscflj165jpkfe90mqf2c3@4ax.com...
>> David Schwartz wrote:
>>
>> > Unless you need resolution over 1280x1024 or need a
> ridiculously large
>> >viewing angle, there are LCDs that serve perfectly for both
> graphics editing
>> >and games. For example, the NEC 2010X is totally suitable to both
>> >applications.
>>
>> I just don't like the fact that they are optimized for one
> resolution.
>> I like to be able to change resolutions without suffering large
>> display-quality degradation.
>
> Chris, I have a 19" LCD with native 1280x1024 resolution. At Keith's
> urging, I have on three occasions made a valiant effort to switch my
> desktop viewing to that resolution. I mean, I tried hard, adjusting
> icon sizes, font sizes, etc. On each occasion, after wasting the
> better part of a day I've had to switch back to 1024x768, which is
> _not_ native resolution but is the only resolution I'm able to put up
> with. Different people have different preferences. Keith thinks I'm
> a neanderthal. He's probably right. ;-)

Did neanderthals have glasses? Think maybe? ;-)

No neanderthal here! ...35 in a few weeks, no glasses, even with 1600x1200
on a 15" laptop. ;-)

--
Keith
 

Tony Hill wrote:

> On Tue, 30 Aug 2005 12:52:36 GMT, CJT <abujlehc@prodigy.net> wrote:
>
>
>>Trent wrote:
>>
>>>Tsk, tsk. Take your spankings over PC power consumption like a man,
>>>puddles. For your sake, I hope that you have a better grasp of your crank
>>>than you do of computers.
>>
>>Hey! Intel and Apple agree that power consumption of PCs is a problem,
>>so I'm in good company.
>
>
> You're in good company for all the wrong reasons. Intel and Apple
> agree that the power density of processors is getting excessive, it's
> just getting too hard to get the power into the chips and getting the
> power out of the chips. They also agree that power consumption has to
> be kept down on laptops where battery life matters.
>
> Neither of them are making any sort of arguments about processor power
> consumption having a noticeable effect on the electrical grid because
> such an argument is nonsense.
>
> -------------
> Tony Hill
> hilla <underscore> 20 <at> yahoo <dot> ca

I guess you don't remember the power shortages in California a few years
back, which were credited in part to the rapid increase in computer use.

--
The e-mail address in our reply-to line is reversed in an attempt to
minimize spam. Our true address is of the form che...@prodigy.net.
 

On Wed, 31 Aug 2005 01:14:25 +0000, CJT wrote:

> Tony Hill wrote:
>
>> On Tue, 30 Aug 2005 12:52:36 GMT, CJT <abujlehc@prodigy.net> wrote:
>>
>>
>>>Trent wrote:
>>>
>>>>Tsk, tsk. Take your spankings over PC power consumption like a man,
>>>>puddles. For your sake, I hope that you have a better grasp of your crank
>>>>than you do of computers.
>>>
>>>Hey! Intel and Apple agree that power consumption of PCs is a problem,
>>>so I'm in good company.
>>
>>
>> You're in good company for all the wrong reasons. Intel and Apple
>> agree that the power density of processors is getting excessive, it's
>> just getting too hard to get the power into the chips and getting the
>> power out of the chips. They also agree that power consumption has to
>> be kept down on laptops where battery life matters.
>>
>> Neither of them are making any sort of arguments about processor power
>> consumption having a noticeable effect on the electrical grid because
>> such an argument is nonsense.
>>
>> -------------
>> Tony Hill
>> hilla <underscore> 20 <at> yahoo <dot> ca
>
> I guess you don't remember the power shortages in California a few years
> back, which were credited in part to the rapid increase in computer use.

You still don't get it. Intel doesn't care about the power grid. If the
idiots in CA had fixed their "deregulation" issues the shortages
would never have existed. PCs had *nothing* to do with the problem and
Intel's corporate mission is *not* to fix "it".

What a maroon!

--

Keith
 

keith wrote:

> On Tue, 30 Aug 2005 03:26:02 +0000, CJT wrote:
>
>
>>keith wrote:
>>
>>
>>>On Mon, 29 Aug 2005 18:13:19 -0700, David Schwartz wrote:
>>>
>>>
>>>
>>>>"CJT" <abujlehc@prodigy.net> wrote in message
>>>>news:4313A54A.8020300@prodigy.net...
>>>>
>>>>
>>>>
>>>>>David Schwartz wrote:
>>>>
>>>>>> Unless you need resolution over 1280x1024 or need a ridiculously
>>>>>>large viewing angle, there are LCDs that serve perfectly for both
>>>>>>graphics editing and games. For example, the NEC 2010X is totally
>>>>>>suitable to both applications.
>>>>
>>>>>1280x1024 isn't exactly hires any more.
>>>>
>>>> There are very few games that support resolutions above that. For normal
>>>>desktop work, 1280x1024 is more than adequate. Personally, I prefer to have
>>>>two LCD monitors, each 1280x1024, using the second one only when
>>>>circumstances require it.
>>>
>>>
>>>As others here will attest, I've been using a 3200x1600 desktop at
>>>work for almost five years. One display is the laptop's LCD, the other is
>>>a 20" monitor. 1280x1024 is *NOT* adequate (though I live with two
>>>19" CRTs at this resolution, each, here at home).
>>>
>>>
>>>
>>>> What percentage of PC computer users do you think have a resolution
>>>> over 1280x1024?
>>>
>>>
>>>What percentage have tried it? What percentage have ever gone back?
>>>Sheesh, I still see people with 1024x768 on 20" monitors at 60Hz! Is that
>>>what we should all aspire to? ...the least common denominator?
>>>
>>
>>Yeah, you da man ... NOT!
>
>
> I see you've now admitted that you've been talking out your ass. Thank
> you.
>
You see nothing.

--
The e-mail address in our reply-to line is reversed in an attempt to
minimize spam. Our true address is of the form che...@prodigy.net.
 

George Macdonald wrote:
> On 29 Aug 2005 16:58:42 -0700, "Robert Myers" <rbmyersusa@gmail.com> wrote:
>
> >
> >George Macdonald wrote:
> >> On 29 Aug 2005 07:33:07 -0700, "Robert Myers" <rbmyersusa@gmail.com> wrote:
> >>
> >> >George Macdonald wrote:
> >> >> On 27 Aug 2005 06:36:43 -0700, "Robert Myers" <rbmyersusa@gmail.com> wrote:
> >> >>
> >> >> >George Macdonald wrote:
> >> >> >> On 26 Aug 2005 07:37:06 -0700, "Robert Myers" <rbmyersusa@gmail.com> wrote:
> >> >> >>
> >>
> >> >> >No matter what power management trickery does for you most of the time,
> >> >> >you've got to be able to cool the thing when it's operating at peak
> >> >> >performance.
> >> >>
> >> >> Well we know that Intel stubbed its toes there at 4GHz and while the end of
> >> >> scaling seems to be accepted as imminent, it's not clear how far other mfrs
> >> >> can go, nor in what time scale. What I'm talking about is also more than
> >> >> what we normally think of as power management - more like distributed
> >> >> dynamic adaptive clocks - there may be a better term for that. 100% load
> >> >> is difficult to categorize there and of course "clock rate" becomes
> >> >> meaningless as a performance indicator.
> >> >>
> >> >> AMD has said that it intends to continue to push clock speeds on single
> >> >> core CPUs and its current offerings do not suffer anywhere near the same
> >> >> heat stress as Intel's even at "100% load"; if AMD can get to 4GHz, and
> >> >> maybe a bit beyond with 65nm, they are quite well positioned. All I'm
> >> >> saying is that I'm not ready to swallow all of Intel's latest market-speak
> >> >> on performance/watt as a new metric for measuring CPU effectiveness. They
> >> >> certainly tried to get too far up the slippery slope too quickly - it still
> >> >> remains to be seen where the real limits are and which technology makes a
> >> >> difference.
> >> >>
> >> >Let's not get into another Intel/AMD round. As it stands now, Intel is
> >> >likely to put its efforts at pushing single thread performance into
> >> >Itanium. Who knows how long that emphasis will last.
> >>
> >> It was an honest and *correct* comment on Intel's technology choices - no
> >> need to have any "round" of anything... and certainly not about Itanium.
> >>
> >Translation: Intel isn't likely to want to play. That may have no
> >bearing on AMD's decision-making whatsoever, but, if AMD wants to go
> >after x86 users with need for single-thread performance, I suspect they
> >will have the market all to themselves. The gamers who have
> >historically carried users hungry for single-threaded performance will
> >all have moved to multi-core machines because that's where they'll be
> >getting the best performance because all of the software will have been
> >rewritten for multiple cores. IBM will stay in the game because IBM
> >wants to keep Power ahead of Itanium on SpecFP, and the x86 chips
> >you'll be looking to buy, if they're available, will be priced like
> >Power5, or whatever it is by then. You know, that monopoly thing.
>
> Hmm, and you said you didn't want to get into another "Intel/AMD"
> round... and yet, there you go again. I was only stating a documented,
> acknowledged fact - your prognostications are not relevant.
>
> The game makers have already stated that they don't expect to get much out
> of multi-core - it looks to me like a single high-speed core is what is needed
> there for a (long) while yet. Hell, dual CPUs have been available for long
> enough and they have not piqued any gaming interest.
>
There are several different issues tangled up here:

1. How much further single-thread performance can be pushed.

2. How much those chips will cost.

3. Whether single-thread chips will dominate gaming.

4. How much of what Intel is doing is pure market-speak.

5. The purported advantage AMD has with respect to "heat stress."

Taking the issues in inverse order:

5. The "heat stress" problems Intel has are the result of having to run
NetBurst at a higher clock to get comparable performance. The P6 core
derivatives have plenty of headroom.

4. For many applications, performance per watt is the figure of merit
of greatest interest because that will determine how much muscle can be
packed into a given space. For those who need single-thread
performance, it isn't a figure of merit that's of interest. If you
really need single-thread performance, there will always be options, at
a price:

http://www.alienware.com/configurator_pages/Aurora_alx.aspx?SysCode=PC-AURORALX-SLI-D

IOW, if it's *that* important to you right now, all you have to do is
to get out your checkbook.

3. It may take a while, but there really isn't anywhere else to go.
The idea of having a separate, specialized physics engine is kind of
silly because there's no reason why the physics can't be done by
another CPU core (the solution I really like, actually, is a design
like Cell, which seems to get the best of both worlds). You're going
to accuse me of shilling for Intel, but you (or someone else reading
this) might be interested in

http://www.intel.com/cd/ids/developer/asmo-na/eng/strategy/multicore/index.htm

Scroll down past the marketing bs to "Application Development and
Performance Resources."

2. Chips with high single-thread performance will continue to be
available because the market will be limited, and the easiest way to
achieve high performance is binning. Intel will have to have at least
one chip so that they can publish SPEC benchmarks, and I suppose a few
dozen will have to be on offer somewhere to keep it legit.

1. I have no idea how much further single thread performance can be
pushed.
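Point 4 can be made concrete with a little arithmetic. The sketch below uses invented numbers (the power budget and the per-chip figures are made up for illustration), but it shows why performance-per-watt, not peak clock, decides how much muscle fits in a fixed power envelope:

```python
# Hypothetical figures: performance-per-watt decides how much total
# muscle fits into a fixed power envelope, e.g. one rack.
RACK_BUDGET_W = 10_000  # assumed rack power budget, watts

chips = {
    # name: (relative performance per chip, watts per chip) - invented
    "hot single-thread chip": (1.5, 130),
    "cooler dual-core chip":  (2.0, 95),
}

for name, (perf, watts) in chips.items():
    n = RACK_BUDGET_W // watts  # how many chips the budget allows
    print(f"{name}: {n} chips fit, total relative performance {n * perf:.0f}")
```

On these made-up numbers the cooler chip wins decisively on throughput per rack, while anyone who needs raw single-thread speed still wants the hot chip at any power cost - which is exactly the tension this thread is arguing about.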

<snip>

>
> >> >> A different programming paradigm/style is not going to help them -
> >> >> expectation of success is not obviously better than that of new
> >> >> semiconductor tweaks, or even technology, which allows another 100x speed
> >> >> ramp over 10 years or so coming along. When I hear talk of new compiler
> >> >> technology to assist here, I'm naturally skeptical, based on past
> >> >> experiences.
> >> >
> >> >Well sure. The compiler first has to reverse engineer the control and
> >> >dataflow graph that's been obscured by the programmer and the
> >> >sequential language with bolted-on parallelism that was used. If you
> >> >could identify the critical path, you'd know what to do, but, even for
> >> >very repetitive calculations, the critical path that is optimized is at
> >> >best a guess.
> >>
> >> It's not static - compilers don't have the right info for the job... and
> >> compiler-compilers won't do it either.
> >>
> >That's why you do profiling.
>
> It makes me wonder sometimes when you spout some buzzword like that as
> though it is known to work well for all general purpose code working on all
> possible data sets. <shrug> People who use compilers know this.
>
Would you have been happier if I had said "feedback-directed
optimization"? The compiler can't accurately infer dependencies from
software written in, say, C. If those ambiguities didn't exist, there
would still be the problem of determining the hot paths in the
software. Given an accurate control and data flow graph, it's pretty
easy to discover the hot paths, and the main reason feedback-directed
optimization doesn't always work well is that there is no accurate
control and data flow graph to begin with. Even the most unlikely of
software, like Microsoft Word, turns out to be incredibly repetitive in
the paths taken through the software.
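A minimal illustration of what "profile to find the hot paths" looks like in practice. This is a toy workload of my own invention, sketched with Python's standard-library profiler rather than any compiler's FDO machinery, but the principle is the same: run representative input, then read the ranked paths off the report:

```python
# Sketch: run a workload under a profiler, then rank functions by
# cumulative time so the hot path surfaces at the top of the report.
import cProfile
import io
import pstats

def hot_inner(n):
    # Deliberately the expensive part: called once per outer item.
    return sum(i * i for i in range(n))

def cold_setup():
    # Cheap one-off setup that profiling should rank near the bottom.
    return list(range(100))

def workload():
    data = cold_setup()
    return [hot_inner(500) for _ in data]

profiler = cProfile.Profile()
profiler.enable()
workload()
profiler.disable()

# Top entries, sorted by cumulative time: the hot path is now visible.
buf = io.StringIO()
pstats.Stats(profiler, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())
```

George's objection still stands in part: the hot paths found this way are only as representative as the data set you profiled with, which is why feedback-directed optimization can mislead on general-purpose code.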

RM
 

Robert Redelmeier wrote:

>
> 2) LCD pixels are extremely sharp. This is great for text,
> but unpleasant for images. The slight blur of CRTs mimics
> natural vision and avoids hyperpixelation.
>

There is no disputing matters of taste. I very much prefer the
appearance of text on a CRT over the appearance of text on an LCD
array.

RM
 

George Macdonald wrote:

[Snip]
> There are also other problems, such as full-motion video artifacts, with
> the very recent (expensive) LCDs which claim to be suitable for
> photography and to have near-sufficient response for games. There's a reason
> that Eizo-Nanao charges an arm and a leg for a screen with ~40ms response time.

You obviously haven't tried out a reasonably new LCD.

My gf has a 19" Samsung with a 12 ms response time. Cost about
$420 (Canadian) a few weeks ago. No trace of ghosting when
watching videos (me) or playing games (her). Settling for an
older Samsung with a 25 ms response time would have saved $40 -
and you have to look hard these days to find a vendor selling LCD
monitors with a response time slower than that.

Current LCD monitors being pitched at gamers have 5 or 8 ms
response times and you pay about a 15% premium over a 12 ms monitor.
 

"Rob Stow" <rob.stow@shaw.ca> wrote in message
news:3TiRe.345420$s54.319602@pd7tw2no...

> Current LCD monitors being pitched at gamers have 5 or 8 ms response times
> and you pay about a 15% premium over a 12 ms monitor.

Since your frame rate is about 60 frames per second, it's hard to
imagine a response time better than 10 ms makes any noticeable difference. I
would imagine it would look a bit better to blur one frame at least slightly
into the next than to shift instantaneously 60 times per second.
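The back-of-the-envelope arithmetic behind that claim (a sketch of the frame-time argument, not a display measurement):

```python
# At 60 frames per second each frame is on screen for 1000/60 ms.
# A panel whose pixels settle within that window has finished its
# transition before the next frame arrives anyway.
frame_time_ms = 1000 / 60  # ~16.7 ms per frame

for response_ms in (25, 12, 8, 5):
    frames = response_ms / frame_time_ms
    print(f"{response_ms:>2} ms panel settles in {frames:.2f} frame(s)")
```

On these numbers a 12 ms panel already settles well inside a single frame, so the 5 ms and 8 ms "gamer" panels buy little at 60 fps; only the 25 ms class spills over into the next frame.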

In any event, I haven't seen noticeable response time issues on any of
the LCD monitors I've seen manufactured in the past 2 years. That would be
at least 15 different models, low end to high end, 15 inch to 20 inch.

DS
 

CJT wrote:
> Tony Hill wrote:
>
>> On Tue, 30 Aug 2005 12:52:36 GMT, CJT <abujlehc@prodigy.net> wrote:
>>
>>
>>> Trent wrote:
>>>
>>>> Tsk, tsk. Take your spankings over PC power consumption like a man,
>>>> puddles. For your sake, I hope that you have a better grasp of your
>>>> crank
>>>> than you do of computers.
>>>
>>>
>>> Hey! Intel and Apple agree that power consumption of PCs is a problem,
>>> so I'm in good company.
>>
>>
>>
>> You're in good company for all the wrong reasons. Intel and Apple
>> agree that the power density of processors is getting excessive, it's
>> just getting too hard to get the power into the chips and getting the
>> power out of the chips. They also agree that power consumption has to
>> be kept down on laptops where battery life matters.
>>
>> Neither of them are making any sort of arguments about processor power
>> consumption having a noticeable effect on the electrical grid because
>> such an argument is nonsense.
>>
>> -------------
>> Tony Hill
>> hilla <underscore> 20 <at> yahoo <dot> ca
>
>
> I guess you don't remember the power shortages in California a few years
> back, which were credited in part to the rapid increase in computer use.
>
I guess you don't remember that they were found to be caused by
deliberate manipulation to drive up the prices...

--
bill davidsen
SBC/Prodigy Yorktown Heights NY data center
http://newsgroups.news.prodigy.com
 

Felger Carbon wrote:

>"chrisv" <chrisv@nospam.invalid> wrote:
>>
>> I just don't like the fact that they are optimized for one resolution.
>> I like to be able to change resolutions without suffering large
>> display-quality degradation.
>
>Chris, I have a 19" LCD with native 1280x1024 resolution. At Keith's
>urging, I have on three occasions made a valiant effort to switch my
>desktop viewing to that resolution. I mean, I tried hard, adjusting
>icon sizes, font sizes, etc. On each occasion, after wasting the
>better part of a day I've had to switch back to 1024x768, which is
>_not_ native resolution but is the only resolution I'm able to put up
>with. Different people have different preferences. Keith thinks I'm
>a neanderthal. He's probably right. ;-)

Yeah, for old folks, the CRT's "resolution flexibility" is definitely
nice. Run a 21" CRT at 1024x768 to get nice large letters. 8)
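Felger's trouble running below native resolution has a simple numeric cause: 1024x768 maps onto a 1280x1024 grid with non-integer, non-uniform scale factors, so every logical pixel gets interpolated across fractional physical pixels. A minimal sketch of the arithmetic (the helper name is made up for illustration):

```python
from fractions import Fraction

def scaling_report(native, desktop):
    """Report how a desktop resolution maps onto an LCD's native pixel grid."""
    (nw, nh), (dw, dh) = native, desktop
    sx, sy = Fraction(nw, dw), Fraction(nh, dh)
    return {
        "scale_x": float(sx),   # physical pixels per logical pixel, horizontally
        "scale_y": float(sy),   # ... and vertically
        # only integer ratios avoid interpolation blur
        "integer_scaling": sx.denominator == 1 and sy.denominator == 1,
        # unequal ratios additionally distort the aspect ratio
        "uniform": sx == sy,
    }

# the 19" panel above: native 1280x1024 (5:4), desktop run at 1024x768 (4:3)
print(scaling_report((1280, 1024), (1024, 768)))
```

The result (1.25x horizontal, 1.33x vertical) is both non-integer and non-uniform, which is why no amount of icon and font fiddling makes it look right: a CRT rescans at the new resolution, an LCD has to resample.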
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

"chrisv" <chrisv@nospam.invalid> wrote in message
news:j27ch1lp67oe90rl25dlavlaisl6tahblg@4ax.com...

> Yeah, for old folks, the CRT's "resolution flexibility" is definitely
> nice. Run a 21" CRT at 1024x768 to get nice large letters. 8)

I use a 20" LCD at 1280x1024 (its native resolution), and one of the
things I like most about it is that I can sit further from it than I could
with a smaller screen or higher resolution. (I previously had an 18" LCD
with the same resolution.) That really reduces eye strain.

DS
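There is real geometry behind the eye-strain point above: a larger panel at the same resolution has a coarser pixel pitch, so each pixel subtends the same visual angle from further away. A sketch, assuming 5:4 panels and using a 1-arcminute pixel as an illustrative threshold (that figure is not from the thread):

```python
import math

def pixel_pitch_mm(diagonal_in, res, aspect=(5, 4)):
    """Physical pixel pitch from a panel's diagonal, resolution and aspect ratio."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)  # panel width from diagonal
    return width_in * 25.4 / res[0]                   # mm per pixel

def viewing_distance_for_angle(pitch_mm, arcmin=1.0):
    """Distance (mm) at which one pixel subtends the given visual angle."""
    return pitch_mm / math.tan(math.radians(arcmin / 60.0))

for diag in (18, 20):   # the two LCDs mentioned, both 1280x1024
    p = pixel_pitch_mm(diag, (1280, 1024))
    d = viewing_distance_for_angle(p)
    print(f'{diag}" at 1280x1024: pitch {p:.2f} mm, '
          f"1-arcminute pixel at {d / 1000:.2f} m")
```

The 20" panel's pitch works out roughly 10% coarser than the 18" one's, which is exactly the extra sitting-back distance described.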
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

On Tue, 30 Aug 2005 19:04:24 -0400, George Macdonald
<fammacd=!SPAM^nothanks@tellurian.com> wrote:


>Maybe you can explain to him how his NEC 2010X works in games:) - the only
>mention I've seen of response time quotes "60ms typical".

Look, I've played games on a couple of LCDs, and when you play games
with lots of blacks a good CRT is clearly superior. Full stop.
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

On Wed, 31 Aug 2005, CJT wrote:

> I guess you don't remember the power shortages in California a few years
> back, which were credited in part to the rapid increase in computer use.

The main cause for the power problems turned out to be Enron's fraud.

--
Yves Bellefeuille
<yan@storm.ca>
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

In comp.sys.ibm.pc.hardware.chips Robert Myers <rbmyersusa@gmail.com> wrote:
> There is no disputing matters of taste. I very much prefer
> the appearance of text on a CRT over the appearance of text
> on an LCD array.

Entirely true. "De gustibus non est disputandum [tametsi peccatum
est]" - there's no disputing taste [even when it's wrong] :)

-- Robert
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

On Tue, 30 Aug 2005 16:47:16 -0700, "David Schwartz" <davids@webmaster.com>
wrote:

>
>"George Macdonald" <fammacd=!SPAM^nothanks@tellurian.com> wrote in message
>news:06p9h1dlp0cuii0vmtatkuieno09gui76m@4ax.com...
>
>> On Tue, 30 Aug 2005 16:00:54 GMT, Praxiteles Democritus <no@email.here>
>> wrote:
>
>>>On Mon, 29 Aug 2005 19:15:35 -0400, George Macdonald
>>><fammacd=!SPAM^nothanks@tellurian.com> wrote:
>
>>>>Maybe he's into photography and games,
>>>
>>>Correct, I'm into both and crt is superior in both instances.
>
>> Maybe you can explain to him how his NEC 2010X works in games:) - the only
>> mention I've seen of response time quotes "60ms typical".
>
> It works really, really well. It totally blows away the Compaq P920
>sitting next to it.

Tell that to the people who play 1st person games with their infamous
gray/black shadows. 60ms isn't even close.
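The "isn't even close" claim is easy to make concrete with back-of-the-envelope arithmetic (a sketch; the frame rate and the faster comparison times are illustrative, not measurements from the thread): a pixel transition slower than the frame interval smears across multiple rendered frames, which is the ghosting gamers see.

```python
def ghost_frames(response_ms, fps):
    """Number of rendered frames a single pixel transition overlaps."""
    frame_ms = 1000.0 / fps          # time budget per frame
    return response_ms / frame_ms

# the quoted 60ms spec vs. two hypothetical faster panels, at 60fps
for resp in (60, 25, 16):
    print(f"{resp}ms panel at 60fps: transition spans "
          f"{ghost_frames(resp, 60):.1f} frames")
```

At 60fps each frame lasts about 16.7ms, so a 60ms transition straddles more than three frames; only a panel at or below the frame interval keeps each transition within a single frame.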

You know monitor mfrs have this "game" they play with specs: they hide the
underlying LCD technology of the panel behind some brand name, like
Xtraview in this case. They do this partly because they can then change
that technology by switching to a different panel without changing product
names/models. According to NEC's meagre technical docs, Xtraview == IPS
but there are also docs around the Web which declare the 2010X to be an
MVA panel.

Many gamers are now well informed on the three basic technologies - TN+film,
VA (MVA, PVA) and IPS - and their variants like P-MVA and S-IPS. They know
which is likely to perform well in games and how much compromise they'll
get if they have dual-purpose needs, like the aforementioned photography +
gaming, and adopt a less than optimal technology for one or the other.

The bottom line is that TN+film in its latest form is the only one of the
three basic technologies which is suitable for modern game play, but it
stinks for photography. S-IPS is the closest to satisfying the dual needs
of photography + games, *but* it's not quite fast enough for the fastest
games with dark shadow elements, and it has other issues, such as "sparkles"
on full-motion video... apparently due to the "overdrive" used to speed up
gray-to-gray switching.

Whatever technology your 2010X uses, it's been around for >4 years and it's
just not up to modern game play. If it satisfies your needs, that's fine
but please don't try to suggest it is up to a CRT for multiple purpose
usage. Personally I have a PVA screen (Samsung obviously) and I'm happy,
delighted even, with its color rendition - its speed is OK for what I do
but I'm not a gamer and I'd never suggest that a gamer use it.

--
Rgds, George Macdonald
 
Archived from groups: comp.sys.ibm.pc.hardware.chips,comp.sys.intel (More info?)

On Wed, 31 Aug 2005 20:25:47 -0400, Yves Bellefeuille wrote:

> On Wed, 31 Aug 2005, CJT wrote:
>
>> I guess you don't remember the power shortages in California a few years
>> back, which were credited in part to the rapid increase in computer use.
>
> The main cause for the power problems turned out to be Enron's fraud.

No, the main cause for both was government incompetence.

--
Keith