Intel's Future Chips: News, Rumours & Reviews


Chad Boga



Sadly no numbers, just impressions.

Until it gets benched properly, I wouldn't be assuming it is the equal of the GT650M.
 


Plus, one was a notebook and another was a tower/case. Totally different cooling capabilities.

If you ask me, I'd even give the kudos to the lappy here for being so awesome against a tower like that.

Cheers! :p
 
Trusted source confirms soldered-on [strike]small-mobo-vendors-destroyer[/strike] Broadwell CPUs
http://techreport.com/news/24191/trusted-source-confirms-soldered-on-broadwell-cpus
looks like the cpu+mobo will become the new gfx card. stay tuned for stuff like
msi twin frozr vii core i7 5700k,
asus ares i5 5670k @ 5 ghz
gigabyte windforce v oc 5800k @ 4.9 ghz
jetway superhipster core i3 5210 (with eye-bleaching yellow ramslots and orange pcie x16 slots)
asrock oc formula/fatal1ty 5700k 4.8 ghz (slightly warped)

haswell gt3 vs gt650m, TR's version:
http://techreport.com/news/24187/haswell-integrated-graphics-keeps-up-with-geforce-gt-650m

edit:
Intel Haswell Overclocking Preview – The Return of BCLK Overclocking?
http://www.hardcoreware.net/intel-haswell-base-clock-overclocking/
 
from what i've read, haswell's highest desktop igpu has only 4 more shaders than hd4000 (assuming that's the reason it's called hd4600 in the 'leaks'). there's no way 4 more shaders can add a triple-digit performance improvement. i suspect intel is comparing the gt3 igpu (in a desktop enclosure to allow higher thermal limits) to ivy bridge's hd2500. then there are intel's crappy drivers. they improve slightly ... but the problems never seem to go away.
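rough back-of-the-envelope with the commonly reported eu counts (treat these numbers as assumptions), pretending performance scales at best linearly with eu count at equal clocks:
[code]
# back-of-the-envelope: scaling ceiling from EU count alone (equal clocks assumed)
# EU counts are the commonly reported figures -- treat them as assumptions
eu_counts = {
    "HD 2500 (IVB GT1)": 6,
    "HD 4000 (IVB GT2)": 16,
    "HD 4600 (HSW GT2)": 20,
    "Haswell GT3": 40,
}

baseline = eu_counts["HD 4000 (IVB GT2)"]
for name, eus in eu_counts.items():
    print(f"{name:18s}: {eus:2d} EUs -> at best ~{eus / baseline:.2f}x HD 4000")

# triple-digit gains only line up against the 6-EU HD 2500, or with GT3 itself
ratio = eu_counts["Haswell GT3"] / eu_counts["HD 2500 (IVB GT1)"]
print(f"GT3 vs HD 2500: up to ~{ratio:.1f}x from EU count alone")
[/code]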
 


Not necessarily. Remember that NVIDIA and AMD, for example, ship very different shader counts; the big difference is how much work each shader does. If each shader got a significant performance boost, then given how well rasterization scales across shaders, the gains multiply with the shader count rather than just add to it.

Granted, I want numbers, but given that it apparently gets performance somewhere in the neighborhood of the GT 650M, it's a promising sign.
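To put rough numbers on that (the 30% per-shader gain below is a made-up figure purely for illustration; the EU counts are the commonly reported ones):
[code]
# total throughput ~ (EU count) x (work per EU per clock) x (clock)
# the 1.3x per-EU gain is invented purely to show how the factors multiply
def relative_perf(eus, per_eu_gain=1.0, clock_scale=1.0, baseline_eus=16):
    return (eus / baseline_eus) * per_eu_gain * clock_scale

print(relative_perf(20))                      # 4 extra EUs alone: 1.25x ceiling
print(relative_perf(20, per_eu_gain=1.3))     # plus a hypothetical 30% per-EU gain: ~1.62x
print(relative_perf(40, per_eu_gain=1.3))     # 40-EU GT3 with the same gain: 3.25x
[/code]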
 

mayankleoboy1

^ AFAIR from the Anandtech coverage, the design of GT3 is very similar to the HD4000.
Only the total number of shaders has been increased to raise performance (and even then, they run at lower clocks to conserve power).

And Anand more than hinted that the real changes will come in Broadwell (which has been countered to some extent by Charlie).
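As a rough sketch of that "wider but slower" trade-off (every clock and voltage number below is made up purely for illustration; dynamic power scales roughly with V^2 x f):
[code]
# illustrative only: same throughput from "wide and slow" vs "narrow and fast"
# dynamic power ~ C x V^2 x f; every number here is an assumption for the sketch
def throughput(eus, clock_ghz):
    return eus * clock_ghz                       # arbitrary units

def dyn_power(eus, clock_ghz, volts):
    return eus * volts ** 2 * clock_ghz          # arbitrary units, C folded into EU count

configs = {
    "narrow/fast (HD 4000-ish)": dict(eus=16, clock_ghz=1.25, volts=1.00),
    "wide/slow (GT3-ish)":       dict(eus=40, clock_ghz=0.50, volts=0.85),
}
for name, c in configs.items():
    print(f"{name}: throughput={throughput(c['eus'], c['clock_ghz']):.1f}, "
          f"power={dyn_power(c['eus'], c['clock_ghz'], c['volts']):.1f}")
[/code]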
 

got it. but the leaks/rumors i've seen don't show any significant change/improvement in the individual shader design. it seemed like haswell will have a similar igpu design to ivy bridge, with haswell gt3 only getting a higher number of them .... according to my rather limited knowledge of the design.. :)
in the gt3 vs 650m demo, intel kept the gt3 build inside a desktop chassis. now, if it was the same stunt amd pulled (trinity laptop hidden in a dt case running multi-monitor), intel coulda just shown the innards. it seemed more like gt3 gets really hot under load and needs more aggressive cooling to perform similarly to the gt650m. i noticed that no one seemed to be actively interested in what's inside the case.
still, this could be very good news for intel-based performance all-in-one pcs.
 

Cazalan



Development/debug kit hardware is often much bigger than the shipping versions. I wouldn't read much into that. Laptop form factors have routinely cooled much more demanding discrete GPUs.

Also it's good to know how far they can push it with a higher end cooler if they do release a desktop version of GT3.
 

true, true. this is sorta what came to my mind first. i ignored it, thinking it was dev kit/prototype hardware, it's just a demo, and stuff. then i tried to remember past events, like back when intel showed off working laptop prototypes yet screwed up with a fake video, and later had anand play the games to verify their claims lol. that's why i thought: it's intel after all, something's gotta be amiss. :D besides, intel should have stockpiled enough haswell silicon by now (rumors/an engineer's reddit ama suggest so). so why didn't they show off working laptop/ultrabook prototypes playing the games like they did with ivy bridge? is it because they want to launch u/y series haswell in june? it's not like amd will suddenly come out with kaveri and pwn them. <end rant :bounce: >

i am all for gt3 on desktop :love: . even if it drives the tdp up from 84w. amd's top dt apus are 100w, so that's not a factor. i was disappointed to see all the 'rumored' dt cpus get the puny gt2. maybe intel got too lazy and decided not to include 40 shaders on the dt silicon.
shaderrage! :(
 

mayankleoboy1

[Image: Sony_Core_i7_phone_prototype.jpg]


Smiling CharlieD for you all.
http://semiaccurate.com/2013/01/15/prototype-intel-core-i7-phone-platform-spotted-and-tested/

And it is incompatible with ARM-based Android apps. Luckily, it can emulate 27 ARM A9 cores with ease.
 
this is so funny (to me)
"Was Intel purposefully decieptful at CES?"
http://semiaccurate.com/2013/01/15/was-intel-purposefully-decieptful-at-ces/
"...software will run badly, faster."
"US starts to feel like a 3rd world country"
"McAfee will be mandatory."
and moar.
i hope that the gt3 demo gets the same treatment. :love:
 
Guest

[Image: Sony_Core_i7_phone_prototype.jpg]


needed to remove the S in httpS:
 

ph1sh55

Are the writers at 'semiaccurate' usually that seething and ridiculous? I hope that doesn't pass for journalism nowadays.
 

Cazalan



He's got some harsh views on things but some of what's written is just to bait the copycat sites. Plagiarism is rampant these days. Most don't even try to hide it.
 

mayankleoboy1

1 month old news:

Broadwell will only have a PCIe 2.0 x4 interface (most prolly on the mobile chips).
Reason: so that OEMs won't use high-performance dGPUs and will be forced to buy high-priced Intel 'next-gen GT3' graphics, prolly with Crystalwell.
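Just to put the rumour in perspective, the raw per-direction link bandwidth works out like this (encoding overhead only; protocol overhead ignored):
[code]
# raw per-direction link bandwidth: PCIe 2.0 is 5 GT/s with 8b/10b encoding,
# PCIe 3.0 is 8 GT/s with 128b/130b (protocol overhead beyond encoding ignored)
def pcie_gbytes_per_s(gen, lanes):
    usable_gbit_per_lane = {2: 5.0 * 8 / 10, 3: 8.0 * 128 / 130}
    return usable_gbit_per_lane[gen] * lanes / 8

print(f"PCIe 2.0 x4 : {pcie_gbytes_per_s(2, 4):.2f} GB/s")    # ~2.00 GB/s
print(f"PCIe 3.0 x16: {pcie_gbytes_per_s(3, 16):.2f} GB/s")   # ~15.75 GB/s
[/code]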
 
If this is true, it still comes down to costs and structures/HW setups.
"Good enough" gfx hasn't been reached yet, as it still has the potential for much growth, for at least two to three more nodes.
Time will tell on all this.
 

mayankleoboy1



:lol: this article reads completely like a Troll post.
 