Intel Phases Out 10 More Mobile Sandy Bridge Processors


teh_chem

Honorable
Jun 20, 2012
902
0
11,010
Incremental upgrades FTW!

I could almost understand phasing in the new CPUs if every IB CPU had the HD 4000 iGPU. But by offering only the HD 2500 on a lot of the low-to-mid range CPUs (where the HD 4000 would actually make sense), they're not really giving users a lot of value, or much of a reason to upgrade.
 
[citation][nom]teh_chem[/nom]Incremental upgrades FTW! I could almost understand phasing in the new CPUs if every IB CPU had the HD 4000 iGPU. But by offering only the HD 2500 on a lot of the low-to-mid range CPUs (where the HD 4000 would actually make sense), they're not really giving users a lot of value, or much of a reason to upgrade.[/citation]

All of the CPUs that replace these ones have HD 4000. Besides, Intel is more interested in promoting Nvidia graphics cards in its low-end laptops, and its low-end desktop CPUs wouldn't have a whole lot of use for HD 4000: anyone who uses them for gaming and such would get a discrete card at the least (even i3s are far too high-end to be wasted on anything weaker than a Radeon 6750; leave the lower-end graphics for lower-end CPUs that also cost less), and anyone who doesn't game should be more than satisfied with the HD 2500.
 

bustapr

Distinguished
Jan 23, 2009
1,613
0
19,780
An i7 laptop is more likely to have Nvidia graphics than to use the on-die HD 4000/HD 2500. If you're getting an i7 laptop without discrete graphics, you're probably doing something wrong.
 
[citation][nom]bustapr[/nom]An i7 laptop is more likely to have Nvidia graphics than to use the on-die HD 4000/HD 2500. If you're getting an i7 laptop without discrete graphics, you're probably doing something wrong.[/citation]

... An i7 laptop buyer is far more likely to be someone who uses, say, QuickSync, someone who wants the faster iGPU for better Lucid optimization of discrete graphics cards, someone who simply wants better CPU performance, et cetera. HD 4000's performance advantages can help discrete cards through Lucid, so it can make a difference even when there is a discrete card! With a little imagination behind your opinions and some more research into the situation, you'd probably see this.
 

teh_chem

Honorable
Jun 20, 2012
902
0
11,010
[citation][nom]blazorthon[/nom]All of the CPUs that replace these ones have HD 4000. Besides, Intel is more interested in promoting Nvidia graphics cards in its low-end laptops, and its low-end desktop CPUs wouldn't have a whole lot of use for HD 4000: anyone who uses them for gaming and such would get a discrete card at the least (even i3s are far too high-end to be wasted on anything weaker than a Radeon 6750; leave the lower-end graphics for lower-end CPUs that also cost less), and anyone who doesn't game should be more than satisfied with the HD 2500.[/citation]
That's a good point; I forgot these were all i7s. But that won't change the fact that the low- and mid-range CPUs will eventually roll out--and gaming performance is only the tip of the iceberg. For video conversion using QuickSync, the HD 4000 is significantly faster than the HD 2500. When you have a "slower" processor (not the best way to put it, but essentially the mid-to-low range CPUs that won't get HD 4000), you'd think Intel would want to supplement that lower CPU power with a more capable iGPU that can handle more GPGPU tasks.
 

A Bad Day

Distinguished
Nov 25, 2011
2,256
0
19,790
[citation][nom]blazorthon[/nom]... An i7 laptop buyer is far more likely to be someone who uses, say, QuickSync, someone who wants the faster iGPU for better Lucid optimization of discrete graphics cards, someone who simply wants better CPU performance, et cetera. HD 4000's performance advantages can help discrete cards through Lucid, so it can make a difference even when there is a discrete card! With a little imagination behind your opinions and some more research into the situation, you'd probably see this.[/citation]

Unless you need Hyper-Threading, the i7s are barely faster than the i5s at a higher price.
 
[citation][nom]A Bad Day[/nom]Unless you need Hyper-Threading, the i7s are barely faster than the i5s at a higher price.[/citation]

It's a very different story in the mobile market, especially with the quad-core i7s versus the dual-core i5s rather than the dual-core i7s versus the dual-core i5s (there are no quad-core i5s for new laptops).
 
[citation][nom]teh_chem[/nom]That's a good point; I forgot these were all i7s. But that won't change the fact that the low- and mid-range CPUs will eventually roll out--and gaming performance is only the tip of the iceberg. For video conversion using QuickSync, the HD 4000 is significantly faster than the HD 2500. When you have a "slower" processor (not the best way to put it, but essentially the mid-to-low range CPUs that won't get HD 4000), you'd think Intel would want to supplement that lower CPU power with a more capable iGPU that can handle more GPGPU tasks.[/citation]

Whether or not future IGPs from Intel will be good enough for modern entry-level gaming doesn't change the fact that the HD 4000 is fairly poor for gaming right now, though admittedly playable when the drivers don't fail. It would be like playing on a Radeon 6450, except with far inferior drivers. If it fails at gaming, then it doesn't make sense to put it in situations where its poor drivers would draw bad reactions from gamers who try it, hurting Intel's reputation.

It isn't ready for that yet, so Intel didn't put it in that situation. As much as I'd rather have HD 4000 on an i3 if I buy one, at least there are good reasons for it not being there.

The argument for supplementing CPUs that have inferior integer and FPU performance with a GPU similar or superior to their bigger brothers' is also somewhat flawed. At best, the i7s would keep a substantial GPU frequency advantage; Intel might lock the frequency in the lower-end models so low that they would only be somewhat better than an HD 2500. Intel would also only do this with some models, and would charge a premium over the models that have the same CPU performance with an HD 2500. Intel would need some way to make use of chips that have faulty graphics, and that would be it.

So, the i7s would still have a substantial GPU performance advantage even if Intel gave the HD 4000 to lower-end models. Keep in mind that Intel HD graphics of the same model can still run at frequencies that are literally all over the place on different CPUs.
 

A Bad Day

Distinguished
Nov 25, 2011
2,256
0
19,790
[citation][nom]blazorthon[/nom]It's a very different story in the mobile market, especially with the quad-core i7s versus the dual-core i5s rather than the dual-core i7s versus the dual-core i5s (there are no quad-core i5s for new laptops).[/citation]

The mobile i7/i5 Nehalem left a sour taste in my mouth, leaving me distrustful of i7s. I have an i7-720QM (1.6 GHz, four cores), which has been consistently outdone by the cheaper high-end mobile i5s unless the workload has 8 threads.

And the high-end i5s could be OC'ed and have their voltage adjusted, unlike the i7s, which was another slap in the face.
 

teh_chem

Honorable
Jun 20, 2012
902
0
11,010
[citation][nom]blazorthon[/nom]Whether or not future IGPs from Intel will be good enough for modern entry-level gaming doesn't change the fact that the HD 4000 is fairly poor for gaming right now, though admittedly playable when the drivers don't fail. It would be like playing on a Radeon 6450, except with far inferior drivers. If it fails at gaming, then it doesn't make sense to put it in situations where its poor drivers would draw bad reactions from gamers who try it, hurting Intel's reputation. It isn't ready for that yet, so Intel didn't put it in that situation. As much as I'd rather have HD 4000 on an i3 if I buy one, at least there are good reasons for it not being there. The argument for supplementing CPUs that have inferior integer and FPU performance with a GPU similar or superior to their bigger brothers' is also somewhat flawed. At best, the i7s would keep a substantial GPU frequency advantage; Intel might lock the frequency in the lower-end models so low that they would only be somewhat better than an HD 2500. Intel would also only do this with some models, and would charge a premium over the models that have the same CPU performance with an HD 2500. Intel would need some way to make use of chips that have faulty graphics, and that would be it. So, the i7s would still have a substantial GPU performance advantage even if Intel gave the HD 4000 to lower-end models. Keep in mind that Intel HD graphics of the same model can still run at frequencies that are literally all over the place on different CPUs.[/citation]
Ah--I thought the GPU was clocked at the same speed regardless of processor? I mean, say, the HD 3000 cores on a 2500K aren't necessarily running at the same speed as on an i3-2100? I thought the only difference between the HD 4000 and HD 2500 was the number of shader cores--16 vs. 6. Is that not the case?

Regardless (and output quality aside), software encoding on the CPU cores alone is still not as fast as QuickSync on the HD 2500--and QuickSync on the HD 2500 is about half as fast as on the HD 4000. I'm not seeing why it's flawed to use a more advanced iGPU for GPU-accelerated tasks.
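For readers wanting to try the software-vs-QuickSync comparison themselves, the usual tool is ffmpeg, which exposes both paths (`libx264` for a pure-CPU encode, `h264_qsv` for the QuickSync hardware encoder). Actually running the hardware path requires an ffmpeg build with QSV support and an Intel iGPU, so this sketch only assembles the two command lines; the file names and helper function are hypothetical:

```python
# Sketch: building the two ffmpeg invocations discussed above --
# a pure-CPU software encode vs. a QuickSync (QSV) hardware encode.
# "libx264" and "h264_qsv" are real ffmpeg encoder names; running the
# QSV one needs a QSV-enabled ffmpeg build and an Intel iGPU.

def encode_cmd(src, dst, use_quicksync=False):
    """Return the ffmpeg argument list for a software or QSV H.264 encode."""
    codec = "h264_qsv" if use_quicksync else "libx264"
    return ["ffmpeg", "-i", src, "-c:v", codec, dst]

print(encode_cmd("in.mp4", "out_sw.mp4"))
print(encode_cmd("in.mp4", "out_hw.mp4", use_quicksync=True))
```

Timing the two commands on the same clip is the simplest way to see the HD 2500 vs. HD 4000 QuickSync gap the posts are describing.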
 
[citation][nom]teh_chem[/nom]Ah--I thought the GPU was clocked at the same speed regardless of processor? I mean, say, the HD 3000 cores on a 2500K aren't necessarily running at the same speed as on an i3-2100? I thought the only difference between the HD 4000 and HD 2500 was the number of shader cores--16 vs. 6. Is that not the case? Regardless (and output quality aside), software encoding on the CPU cores alone is still not as fast as QuickSync on the HD 2500--and QuickSync on the HD 2500 is about half as fast as on the HD 4000. I'm not seeing why it's flawed to use a more advanced iGPU for GPU-accelerated tasks.[/citation]

For HD 3000, the frequency difference between, say, an i3-2105 and a 2500K (the 2100 has HD 2000, not HD 3000) can leave the i3's graphics more than 20-30% slower. I think that this frequency can be overclocked, but I might be wrong.

Your point that using a more powerful GPU for GPGPU tasks is not flawed stands on its own. However, letting a cheaper CPU get too close to a more expensive model in a higher-end range can ruin the point of that range. For example, if Intel made an affordable LGA 1155 six-core CPU, the more expensive LGA 2011 platform would drop in importance (and profitability) substantially. Using better hardware for a task is not flawed, but putting high-end hardware in low-end setups can be a flawed concept, because the higher-end setups lose value relative to the low-end setups when this happens.

What Intel could do is release a few models that have the higher-end GPU at a premium over the same CPU with a weaker IGP, like I said and like they do with some i3s and i5s for Sandy Bridge. However, letting all of their lower-end CPU lines use the top-end GPU would decrease the value of Intel's higher-end CPUs. The argument I made about the higher-end GPU not being ready for consumer gaming because of poor drivers is also worth considering. Why give people incentive to use a product in a way it isn't ready for, and the opportunity to be pissed off when it doesn't work?
 
[citation][nom]A Bad Day[/nom]The mobile i7/i5 Nehalem left a sour taste in my mouth, leaving me distrustful of i7s. I have an i7-720QM (1.6 GHz, four cores), which has been consistently outdone by the cheaper high-end mobile i5s unless the workload has 8 threads. And the high-end i5s could be OC'ed and have their voltage adjusted, unlike the i7s, which was another slap in the face.[/citation]

I admit that Nehalem was really just very poorly thought out and organized. It was all over the place in many ways, and Intel could have done a much better job of it. Sandy Bridge is a much better effort, although I don't like how heavily Intel clamped down on low-end overclocking with Sandy.
 

A Bad Day

Distinguished
Nov 25, 2011
2,256
0
19,790
[citation][nom]blazorthon[/nom]I admit that Nehalem was really just very poorly thought out and organized. It was all over the place in many ways, and Intel could have done a much better job of it. Sandy Bridge is a much better effort, although I don't like how heavily Intel clamped down on low-end overclocking with Sandy.[/citation]

Actually, Intel also crippled OCing for even the highest-end i7s. A high-end mobile i7 Nehalem OC'ed properly will outperform a high-end mobile SB i7, or maybe even an IB one.
 
[citation][nom]A Bad Day[/nom]Actually, Intel also crippled OCing for even the highest-end i7s. A high-end mobile i7 Nehalem OC'ed properly will outperform a high-end mobile SB i7, or maybe even an IB one.[/citation]

Yes, but while using a helluva lot more power. Also, some of the top-end mobile SB i7s can still be overclocked, and some IB models probably can be as well.
 

A Bad Day

Distinguished
Nov 25, 2011
2,256
0
19,790
[citation][nom]blazorthon[/nom]Yes, but while using a helluva lot more power. Also, some of the top-end mobile SB i7s can still be overclocked, and some IB models probably can be as well.[/citation]

From the laptop sites that I read, people found it very difficult to OC SB compared to the older i7s.

And I don't think they really cared about power consumption, especially when many of them turned their laptops into stationary machines.
 
[citation][nom]A Bad Day[/nom]From the laptop sites that I read, people found it very difficult to OC SB compared to the older i7s. And I don't think they really cared about power consumption, especially when many of them turned their laptops into stationary machines.[/citation]

How can it be more difficult? It should be as simple as raising the multiplier, because the top SB i7s are multiplier-unlocked.
 