Intel's Future Chips: News, Rumours & Reviews



This is at one company though.

It could cause a change, but I find it hard to see a price fluctuation like this causing it.

What will be interesting, though, now that AMD is moving GPUs to TSMC, is whether that will affect prices. I imagine TSMC will not be able to fill both orders without a bit of strain.
 
Uhm... Well, the biggest data centers are still in the USA, so it makes sense from a profit perspective to do that.

I haven't noticed any price spikes myself, since I don't need a new CPU, but I did see some news about i7s going up in price by quite a lot.

AMD has the perfect opportunity to regain market share, so they had better take it... I don't think Intel is going to give them a second window of opportunity like this one for a very long time.

Cheers!
 

aldaia

Shares of Intel were up nearly 5% after a research report (from an analyst at BlueFin Research Partners) said that Intel could ramp up production of its 10-nanometer processors sooner than the expected June 2019 timeline.

www.businessinsider.com/intel-stock-price-report-could-speed-up-10-nanometer-chip-production-2018-10

Edit: Intel has neither denied nor confirmed the report.
 


It's the best way to be profitable. AMD can have more consumer market share. Intel does not care nearly as much about consumer as it does about the server and HPC market, because the profit margins there are vastly better. Same with GPUs.
 


You're wrong there, Jimmy.

Intel DOES care about it. It still makes a lot of money out of consumer, and it's not a market you can scoff at. Remember that the consumer market spans everything from commodity computers to high-end luxo-epeen-beasts, and OEMs in that vertical still pay a lot. Given Intel's size, it is just not acceptable for them to drop the ball in either market with the dominance they have.

And I can absolutely imagine they can get their act together and pull 10nm back onto an acceptable-ish schedule. I mean, they have piles and piles of money to throw at the problem. If there were ever a time to burn that pile on something, this is one of the best.

Cheers!
 


I am not wrong. Intel can afford to lose more consumer market than they can server/HPC. The margins in server/HPC vastly outweigh their consumer market margins.

I am not saying they don't care at all, but if they lose some consumer share they are not going to cry. Now, if they lose that nice ~99% server market share, they would cry a lot more, as their margins would drop, and even having 100% of the consumer market share would not make up for that.
 


I think the real question is whether Intel should just skip 10nm and go all-in on 7nm instead. Right now, Intel's schedule has 10nm for Q3/Q4 2019, which is awfully late to the game. You can argue Intel may as well redirect that money to 7nm, and quite possibly release a new CPU arch alongside it.

The other question, knowing die shrinks are coming to an end one way or another, is whether it makes sense for Intel to maintain its own fabs now that they aren't a clear benefit.
 


I would say yes but then there is the rumor that they might have 10nm out earlier than that.

As for the fabs, I think there is still a benefit to them. They have complete control over QA and won't run into the same issues AMD would, such as a partner failing to deliver (GloFo 7nm) or competing business preventing them from pushing out as many finished products, since a fab can only do so much at a time.

I mean, TSMC is probably going to run into some constraints if they are pushing out AMD CPUs/GPUs and Nvidia GPUs.
 
https://elchapuzasinformatico.com/2018/09/intel-core-i7-9700k-review/

Take it with a grain of salt, but if true it looks like the 9000 series will be decent. It needs a good full review, though. This shows it beating the 8700K and keeping up with (or barely beating) a 2700X. To be fair, Intel is capable of better clock speeds at the moment, and it looks like this chip will push to 5GHz, although the review stated 1.4 V, and I am not sure I am comfortable with that voltage long term.
 
I'm not liking the numbers, TBH... 240W at full load? That's kind of high, isn't it? What's more, this is the non-HT version.

Intel did what it had to do at the very least: get rid of the toothpaste and clock it to the clouds, power be damned. They are pushing their 14nm to the limit with the i7 9700K and i7 9800K (9900K?) it seems. They have zero "brute force" margin of improvement for the next generation.

They are stuck between a rock and a hard place this gen and you can see it. Sure, AMD is squeezing all it can out of the 2700X this generation as well, but they were playing catch-up, so that's expected. And even then, when you analyze the CPU itself, it's a marvel in its own right, using an arguably lesser process node, packing that many cores AND reaching, it seems, IPC parity at times.

Now here's the kicker, and I can't even believe I'm phrasing it like this: Intel will have to face the successor of the 2700X with the 9K series. They have "wide" parity with the 2700X at the same price point, but AMD is about to push the next gen and they won't have any mercy. From the rumours aldaia has been posting, if AMD closes the gap clock-wise, Intel will have a hard time justifying the new price points.

Cheers!
 


That 240W is with auto voltage at stock clocks on an ES sample; this is not a true review. When they set the voltage manually to 1.3 V and it remained stable, it hit 130W at stock clock settings. It looks like the UEFI is overvolting the chip, something those auto-overclocking features tend to do as well.

Once there is an official review with a non-ES chip and a non-ES BIOS, we can see if those numbers change. However, since the majority of people who buy this chip will do so with the intention of overclocking, the stock voltage settings are nothing to worry about.
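As a rough back-of-the-envelope (a textbook approximation, not numbers from the review), dynamic switching power scales with the square of voltage, which is why a UEFI overvolt from 1.3 V to 1.4 V is not free:

```latex
% Rough dynamic-power model: switching power only, P_dyn ~ C * V^2 * f
\[
P_{\mathrm{dyn}} \approx C\,V^{2}f
\qquad\Rightarrow\qquad
\frac{P_{1.3\,\mathrm{V}}}{P_{1.4\,\mathrm{V}}} \approx \left(\frac{1.3}{1.4}\right)^{2} \approx 0.86
\]
% Dropping from 1.4 V to 1.3 V saves roughly 14% on switching power alone;
% the rest of the reported 240 W -> 130 W gap would have to come from lower
% leakage, different boost behaviour, or the ES board/BIOS misreporting.
```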

I have a feeling that AMD's next-gen CPUs won't do much in terms of performance gains. CPUs have really hit a brick wall. Intel was already there, and as you stated, AMD is playing catch-up. However, if AMD meets or slightly beats Intel overall, and not just in a few tasks, then don't expect them to keep the pricing so low. History has shown that when they can, they price-match the competition. I wouldn't be surprised if the 3000 series (or whatever they call it) is priced to match the 9000 series, and they might even start matching Intel's HEDT pricing with Threadripper.

Of course, all of this depends on them being able to match performance and on the 7nm process they are going to use delivering.
 

aldaia

https://www.patreon.com/posts/21950120

The benchmarks carried out by Principled Technologies are even more bogus than we first thought. A few viewers pointed out that the Ryzen 7 2700X was listed as tested in the “Game Mode” within the Ryzen Master software and I foolishly thought they might have just made a simple copy and paste error in their document as they would have used this mode for the 2950X. This does explain why the Threadripper CPUs were faster than the 2700X in every test.

What this means is a CCX module in the 2700X was completely disabled, essentially turning it into a quad-core. I’ve gone ahead and re-run the XMP 2933 test with Game Mode enabled and now I’m getting results that are within the margin of error to those published by Principled Technologies.

 
I'll say this: This is a bad look regardless, but I wonder if Intel specifically had them do this, or if Principled Technologies did this on their own.

And no, there's absolutely no legal case here; what was done has been documented and replicated by others. Even if it is shady as all hell.

Waiting to see how long it takes Intel to throw Principled Technologies under the bus.
 


I think it was the testers. I looked at the AMD software, and it really doesn't explain what Game Mode does on non-Threadripper CPUs. I think the software itself needs to be tweaked, since Windows can pull the CPUID and the software could be told not to allow "Game Mode" on chips it doesn't suit.

This app is made for all CPUs, but that part is specifically designed for Threadripper. I don't think Intel would be stupid enough to tell them to disable half the cores to look better when the 8700K is already the best gaming CPU right now. I mean, how would the 9900K do any worse, considering it now has STIM and still holds a clock speed advantage?

I think it was ignorance of the software itself, and a fail on the software's end too. I actually wonder how many people turn that on thinking it makes their CPU perform better, only to lose performance.
 


I wonder how many people are bashing Intel without knowing what "Game Mode" even is in this case? Heck, I wonder how many of them also have Game Mode enabled?
 
I have a 2700X and I've never come across a "game mode" option. Ryzen Master needs to be downloaded separately and it comes bundled with warnings. Your average user WILL NOT use it, I can tell you that much.

Hell, even I took a look at it and found it useless, haha. I still use manual core-parking (affinity).
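For anyone curious, manual affinity pinning can be scripted too; here's a minimal sketch using Python's psutil. The core list assumes the first CCX of a 2700X shows up as logical processors 0-7 (that mapping varies by board/OS), and the process name is just a placeholder.

```python
import psutil

# Assumption: logical CPUs 0-7 are CCX0 (4 cores + SMT) on this 2700X system.
# Check your own topology first; the numbering is not guaranteed.
CCX0 = list(range(8))

def pin_to_ccx0(process_name: str) -> None:
    """Restrict every matching process to CCX0 so its threads never hop CCXs."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(CCX0)  # same effect as setting affinity in Task Manager
            print(f"Pinned PID {proc.pid} to logical CPUs {CCX0}")

if __name__ == "__main__":
    pin_to_ccx0("game.exe")  # placeholder executable name
```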

As for the skewed numbers themselves... Well, who in good faith trusts marketing slides on performance for anything beyond ball-parking? And, on the other hand, do you think the external provider did this without lawyer guidance? I'm pretty sure they had all the tests audited by a big group of lawyers XD

Cheers!
 


[Image: Ryzen Master UI system monitoring screenshot (138886-ryzen-master-ui-system-monitoring-1260.JPG)]


It is there. That is a screenshot from AMD's own site, taken on a 2700X (look in the upper left), and the Game Mode tab is at the bottom.

This utility is the one AMD designed for overclocking. And you are correct, plenty won't use it; most overclockers will use the BIOS. However, I am sure it also comes up if you download the drivers.
 
A company including a "Game Mode" that cripples the majority of their mainstream CPUs deserves to have that exploited by their competition. I'd be pretty darn pissed if I had my gaming performance crippled by enabling "Game Mode." The big takeaway here shouldn't be Intel fudging benchmarks (that's somewhat expected; if you want real benchmarks, wait for independent third parties), it should be AMD's Game Mode crippling CPUs. Intel doesn't specialize in accurate benchmarks; they specialize in selling CPUs.
 


The idea behind Game Mode is that you enable fewer cores so you can clock higher, leading to increased performance in titles that only scale across a few CPU cores. We've literally been discussing this for over a decade now; remember the E8600 versus Q6600 debates we used to have? The real issue is that as titles use more cores, Game Mode becomes counter-productive. [Note: Ideally the CPU handles what Game Mode is doing on its own; not sure why it doesn't, TBH.]
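To put that clocks-versus-cores tradeoff in concrete terms, here's a toy Amdahl's-law-style sketch (the clock speeds and scaling fractions are made up for illustration, not taken from any benchmark):

```python
# Toy model: frame rate ~ clock / (serial work + parallel work spread over cores).
def relative_fps(clock_ghz: float, cores: int, parallel_fraction: float) -> float:
    serial = 1.0 - parallel_fraction
    frame_time = serial + parallel_fraction / cores  # normalized work per frame
    return clock_ghz / frame_time                    # higher is better

# Lightly threaded title (30% parallel): 4 cores boosting higher beat 8 slower cores.
print(relative_fps(4.00, 8, 0.30), relative_fps(4.35, 4, 0.30))  # ~5.4 vs ~5.6
# Heavily threaded title (80% parallel): the extra cores win despite the lower clock.
print(relative_fps(4.00, 8, 0.80), relative_fps(4.35, 4, 0.80))  # ~13.3 vs ~10.9
```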
 


I meant "out of the box". The BIOSes don't come with a "game mode" option in them, and, like I pointed out, you have to actually know Ryzen Master exists as a separate download and THEN go get it.



It might still make sense for Ryzen, since disabling a CCX for games keeps Windows from scheduling threads onto the adjacent CCX; it may not net significant gains, but the theory is there.

Now, when you know the game being tested can go as wide as 6-8 cores, disabling a CCX instead of the SMT threads is dumb, yes.

Cheers!
 


Multi-threading has been increasing performance in major game titles for a few years now. Once again, you're just telling me how AMD has made poor choices with their products (software included).

If anything disabling SMT would be better than disabling half the cores. Their Game Mode just seems like a poor choice all around.

The 2700X was slower in gaming than the 8700K and the 7700K; no one should actually believe it would be faster than the 9900K. Intel really had no reason to even compare them in gaming. I'd say the decision to benchmark it against the AMD chip in gaming was a poor choice on Intel's part. They had no reason to do that, and for that reason alone I would say shame on Intel.

Expecting Intel to use anything less than a completely stock AMD system in a comparison with an Intel system is a pipe dream.
 
If anything disabling SMT would be better than disabling half the cores. Their Game Mode just seems like a poor choice all around.

Not quite in this case; the issue is that moving workloads between CCXs can cause some performance loss, so, as Yuka noted, there are valid reasons why you'd prefer to disable a CCX.

It's an architectural design choice; you lose performance in tasks that scale across all the cores when half of them sit unused, but you gain in the ones that don't. Hence, "Game Mode".

Still waiting for Intel to throw Principled Technologies under the bus.
 