Intel's Future Chips: News, Rumours & Reviews

Page 23

the i5 5250U is more thermally constrained than the 7400K, though. intel needs the edram as well to fully uncork its gaming performance. and... 17 fps vs 13 fps... both are crap.
 


Broadwell is a very minor incremental improvement over Haswell. Nothing new here. Let's wait for Skylake, which is the next new architecture.
 

i5 5250U is a dual-core soc with hyperthreading, thermally so limited that it throttles easily under load. temps weren't included, but i think it runs near 90-100C or higher during gaming. if you're comparing the cpus, look at the transcoding and rendering benches. from the gaming benches, i think intel still hasn't got something like powertune/boost for the igpu.
 

oh kay... i'll still ask: according to you, intel has nothing to show for what exactly? unless it was a rant - in that case i regret replying in the first place. :)
 


Funny, I never viewed Broadwell as being a game changer, though plenty of others here did. Wait until Skylake, since that's the new uArch.
 


Rumored by whom? I never really strayed much from my 20% prediction, which it did meet in synthetics. Early reports show Broadwell performing about where I predicted it would.
 


Except, well, market share.

Power consumption is one of the areas they are focusing on the most. As for graphics, even the HD 4600 is enough for most people, so any improvement will be considered good. Anyway, why would it be a waste if they got to Kaveri-level graphics? That's a pretty amazing achievement.

Also, comparing 65-watt CPUs to 15-watt ones targeted at different markets is not the right way to compare products.
 

i looked into intel's igpu performance from hd5k to hd6k using the tr article. several things stood out:
■ intel's igpus showed performance levelling off from the hd 4400 (GT2) to the hd 5000 (GT3), even though hd5k had 2x the shaders (20 vs 40 eus; bdw GT3 has a reworked 48 eus afaik). the hd5k igpu was in a 15w soc like the i5 5250u. the i5 5250u is either thermally limited, memory limited, cooler limited, or all of those. i think the overall obtainable performance from these igpus is hitting a wall somewhere; throwing more shaders at the problem isn't the answer anymore. the tr review also used ddr3 1600 ram. it's the default ram for bdw, but i would've liked to see memory scaling tested (see the rough bandwidth numbers after this list).
■ drivers aren't mature enough. in intel's case it may take 6 months to a year.
■ the tr review shoulda used more games, especially gpu-bound games, for a better performance overview. using thief and then jumping into subjective analysis only gave a good idea of the subjective side (though it somewhat matched my experience with intel's igpus).
■ intel really doesn't have anything to show for bdw's igpu because it was difficult enough already to get it out of the fabs and then [strike]trick[/strike] convince unwilling oems into building pcs around the socs/cpus.
■ the rumor seems to be extremetech's expectation only.
■ i'll need more info before i think intel is doomed. to me, intel's so-called hd gfx is really a video processor with entry-level compute capabilities that does web browsing and light gaming. from what i've read about bdw's gpu, that hasn't changed yet.
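
quick back-of-envelope on the memory side, as mentioned above (my own idealized numbers, not from the tr review); it shows how little bandwidth headroom a GT3-class igpu has at ddr3 1600:

[code]
# back-of-envelope peak memory bandwidth shared by the cpu and igpu
# (idealized: dual-channel ddr3, 64-bit / 8-byte bus per channel)
def peak_bandwidth_gbs(mt_per_s, channels=2, bytes_per_transfer=8):
    # theoretical peak = transfers per second * bytes per transfer * channels
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

for speed in (1333, 1600, 1866, 2133):
    print(f"ddr3-{speed}: {peak_bandwidth_gbs(speed):.1f} GB/s")
# ddr3-1600 -> 25.6 GB/s, ddr3-1866 -> ~29.9 GB/s (~17% more),
# which is why a memory-scaling test on a 48 eu part would have been interesting.
[/code]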
 


Casuals. I can say for a fact that when doing normal things like YouTube or Netflix in HD it's smooth, and it plays casual games fine. That's what I mean by most people.

Think of it this way: Intel can spend all its money and time on power consumption and graphics since their CPUs are well ahead of the competition (and I'm convinced Intel doesn't even see AMD as competition anymore). Intel is fighting the one that is really beating them in some areas, and that's ARM.
 


So, a guy on Extremetech says so, so it must be true? Maybe I should start a website and say "I'm expecting 20% gains because that's what Intel was able to do every generation going back to SB". The only difference between us would be that I'd be the one who's right, though.

Seriously, "Some guy on the Internet said" isn't going to fly.
 

that has been true since llano. however, intel can compete in terms of performance, even price/perf, but they will risk internal cannibalization at the mobile level if they try. you might see this when arm socs with a53/a57 cores and hevc hardware support become commonplace, especially the ones from mediatek, e.g. 8x a53 with a barebones gpu plus 4K and HEVC support. intel can theoretically build something like a 6-core (2.5GHz turbo) atom with 48 eus and edram right now, but it'd really hurt other cpu sales and profit.

and... weparrotothersourcesoftech.com isn't reliable as a source.
 


The sites you link are not even remotely credible; I'd take gamer's word over them any day. These are the same sites and authors who claimed Bulldozer would be 50% faster and claimed the original Phenom was going to beat Intel's quad cores.

Gamer is simply using history to back up his claim of 20% faster.
 


As the majority OS in servers: Unix, what else?
 
I am going to ask you thread regulars to tone it down a notch (or five) before this gets out of hand. Attack positions, not each other. Fight with facts and sources, not names, profanity, and insults. Be polite or be gone. No need to get emotional about any of this.

Don't make me pull the car over... Thank you for your cooperation and consideration of others.
 


both of you are pointing to different entities (rumour and self).
tourist is pointing towards rumours, not towards gamer's prediction,
so it is pointless to discuss whether anyone said this or that.
 


Intel has a CPU with a 15W TDP that performs 24% worse (in Thief) than the 65W AMD A6-7400K. In other words, Intel has some pretty incredible power efficiency to show for all the R&D money it has spent.

Intel has created a processor that will perform a bit worse in 3D gaming, but allows for much better battery life on a smaller battery. Smaller battery means lighter notebook and better portability.

Now, this part is just conjecture, but I would be willing to bet that, if Intel wanted to, they could design a 4-core CPU with a much better iGPU on a 28nm chip with a 65W TDP or power draw. Intel is not going for raw power though, they are throwing R&D money into shrinking die size and improving efficiency while still providing incremental performance improvements. That is exactly what they are achieving, so I would say they have a lot to show for their R&D money.
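
A rough sketch of the efficiency point, using the Thief frame rates quoted earlier in the thread and TDP as a crude stand-in for actual power draw (so treat it as indicative only):

[code]
# fps per TDP watt, from the numbers quoted in this thread
# (13 fps @ 15 W for the i5-5250U vs 17 fps @ 65 W for the A6-7400K)
chips = {
    "i5-5250U (15 W)": (13, 15),
    "A6-7400K (65 W)": (17, 65),
}
for name, (fps, watts) in chips.items():
    print(f"{name}: {fps / watts:.2f} fps/W")
# ~0.87 vs ~0.26 fps/W: roughly 3x the gaming performance per TDP watt,
# even though the absolute frame rate is about 24% lower (13/17 ~= 0.76).
[/code]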
 


And when they WANT brute power, we have the LGA 2011 socket to provide that... not in the iGPU department anyway; it's more cache-based.
 


True, neither is proven to be correct. Obviously. But you, like others here, need to understand that marketing slides are nothing but claims without evidence; Intel, AMD, and Nvidia do this all the time. AMD actually hasn't been lately, I've noticed.

Not even sure why they do it when they get caught, and I'm not sure why Intel does it when they don't need to.
 
It's for the people who are addicted to marketing, and so PR gets a job. If PR can make the marketing LOOK good, it gives 'em a job. At least, that's what I think. Then again, I have never taken business classes and have done little in that area. 😛
 

by o.s. do you mean the arm socs in laptop and dt form factors? they'd likely run either chrome o.s. or android... maybe one of those armv8-compatible linux distros (unlikely).
as for consumers, the same ones that buy chromebooks. i don't know much about streamers, but something like an ouya console with an upgraded soc... i think. and tv dongles/media streamers. intel is already competing in media dongles with low-power atom socs. basically even moar types of content-consumption devices.

edit: willinglyplagiarizefacetsoftech.com is unreliable in the sense that it misunderstands and sometimes outright blunders. when they parrot they're barely ok, but when they add their own stuff they mess up. so using them in arguments or as a source can lead to undesirable outcomes. speaking from past experience, LOL.


intel doesn't have a 28nm process; its 22nm is a full node shrink from 32nm. amd uses glofo's 28nm bulk, which is, er... less than a half-node shrink from its 32nm SOI process but seems to offer quite high density. while there is a lot of P.R. trickery in naming a process... intel still couldn't design a much better gpu (i.e. a graphics processing unit) than amd's. little to do with fabrication.
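
for anyone wondering what "full node" vs "less than a half node" means in numbers, here's the idealized scaling (real processes never shrink this cleanly, so it's only a rule of thumb):

[code]
import math

# a full node shrink is ~1/sqrt(2) in linear dimensions, i.e. ~0.5x the area
full_node = 1 / math.sqrt(2)
print(f"32nm full-node shrink: ~{32 * full_node:.0f} nm (intel's 22nm class)")

# 32nm -> 28nm is only 0.875x linear, ~0.77x area (~23% smaller)
print(f"28nm vs 32nm area ratio: {(28 / 32) ** 2:.2f}x")
# a true full node would be ~0.5x the area, so 28nm is well short of one.
[/code]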

edit2:
Intel desktop Skylake CPUs delayed
http://www.cpu-world.com/news_2015/2015021801_Intel_to_delay_desktop_Skylake_CPUs.html
windows 10, windows 10, does whatever a windows 10 can...
 


So you don't think Intel could drop back to a 22 or 32nm process and use all that extra die space to brute-force their way to a solid iGPU if they wanted to and didn't care so much about efficiency? That's the assumption (could very well be wrong, as it is just a guess) I was making.
 

heh, i assumed something similar when hd4k came out in the ivy bridge lineup.
i think that during 32nm and ivy bridge (22nm), intel didn't have a scalable uarch for the igpus. second, it was squarely aiming for laptops and ultracraps. it had to get low power use, the on-die integrated gpu, power management and other hardware-related stuff right all at once. and it wasn't even a gpu company like amd (ati) and nvidia. so instead it started small and gradually tuned bits and pieces of the gpu, as any chip designer would do. i don't know much about larrabee, but there's that too. most of intel's results got overshadowed by er... marketing(!) but it did put in some effort. by haswell it finally had enough for brute force, and thanks to apple we got to see GT3e at least a gen early IMO. the GT3 part is actually quite inefficient and comes close to a brute-force approach. for example, intel could sell an i5 5250u as a desktop 65W core i3 with an iris gpu. for most people it'd be more than good enough, but it'll cannibalize i5/i7 + GT2 cpus.
 
Status
Not open for further replies.