Intel's Future Chips: News, Rumours & Reviews


mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810


If you believe Intel, it will be "miles ahead. A true revolution in graphics, with HD gaming."
If you believe AMD, "Trinity beats all competition. With Richland, we will be even further ahead."
If you believe Nvidia, "we have Nvidia-optimized games for all platforms, which will suck on all other hardware."
 

skaughtz

Distinguished
Sep 7, 2011
78
0
18,640



I believe in nussing, Lebowski.



... but I'll take that as a "no." I skimmed through the links posted in this thread, but I couldn't find anything definitive. Oh well.
 

here's what i know so far. add 'rumor' to every sentence and take 'em with a grain of salt.
the desktop haswell igpu will be hd4600: 20 eus (shaders), 4 more than ivy bridge's 16 in hd4000. that one will probably be only a minor improvement over ivy bridge. anything more will come from mature (LOL) drivers and design tweaks/improvements.
there is another haswell igpu variant named gt3 (gt2 being the 20-shader hd4600, gt1 the least powerful one) with 40 shaders, which so far seems to be exclusive to ultrabook skus. in theory, 40 shaders in the igpu could deliver massive performance, by intel standards, but it looks like the extra shaders will be used for better power efficiency in ultrabook-class devices. intel did a 'demo' of gt3 vs an nvidia gt 650M at ces, showing gt3 being close to the 650M (bragging ensued). at this point, it's unclear whether there will be gt3-based skus for desktops and mainstream laptops.
as for faring against amd igpus, hd4000 can catch up to llano's 6550d igpu in some non-gaming tasks, while trinity's 7660d igpu can run circles around it. i don't think gt2 will outperform the 7660d, but gt3 might have a chance if it's given desktop/laptop-class thermal headroom.
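
a rough way to see why doubling shaders rarely doubles fps: igpus share the cpu's memory bus, so compute scales until bandwidth caps it. here's a toy model in c (all the numbers are made up for illustration, not real haswell figures):

[code]
#include <stdio.h>

/* Toy bandwidth-capped scaling model. Relative performance grows with
 * shader count until the shared DDR3 bus becomes the bottleneck.
 * The cap value is invented for illustration, not a leaked spec. */
static double relative_perf(int shaders, double bw_cap)
{
    double compute = shaders / 20.0;          /* relative to a 20-EU GT2 */
    return (compute < bw_cap) ? compute : bw_cap;
}

int main(void)
{
    double bw_cap = 1.4;  /* pretend bandwidth runs out at ~1.4x GT2 */
    printf("gt2 (20 eu): %.2fx\n", relative_perf(20, bw_cap));
    printf("gt3 (40 eu): %.2fx\n", relative_perf(40, bw_cap));
    return 0;
}
[/code]

under a model like this, gt3's 2x shaders only buy ~1.4x performance on the same memory bus, which is also why the crystalwell edram mentioned later in this thread matters: it raises the cap.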
 

truegenius

Distinguished
BANNED
if adding more shaders gave a linear performance boost, then amd would have already done it, as they have plenty of tdp room in the 5600k
but it is not linear

question :- is intel's eu a cluster of shaders, or just a single shader? :/ any technical specifications about it? :??:
 

http://www.realworldtech.com/ivy-bridge-gpu/
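
short version, as i read that article: each gen7 eu is a small cluster (two 4-wide simd alus), not a single shader. a quick sanity check against hd4000's advertised peak, counting a multiply-add as 2 flops (the figures below are my reading of the article plus intel's spec sheets, so double-check them):

[code]
#include <stdio.h>

int main(void)
{
    /* HD 4000 (ivy bridge GT2) -- numbers as I understand them,
     * treat as my reading of the realworldtech article, not gospel. */
    int eus          = 16;   /* execution units */
    int simd_per_eu  = 2;    /* two 4-wide SIMD ALUs per EU */
    int lanes        = 4;    /* lanes per SIMD ALU */
    int flops_per_op = 2;    /* multiply-add = 2 FLOPs */
    double ghz       = 1.15; /* max turbo on desktop parts */

    double gflops = eus * simd_per_eu * lanes * flops_per_op * ghz;
    printf("peak: %.1f GFLOPS\n", gflops); /* ~294.4, matches the spec */
    return 0;
}
[/code]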
 

RussK1

Splendid



If it proves true that the next-gen consoles are going to use AMD, that leaves nVidia the odd man out. Serves them right... So much for PhysX!
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810


Just to add a bit: the ultra-wide RAM, a.k.a. Crystalwell, is real. But as CharlieD says, its pricing makes it very difficult for OEMs to integrate.
 

melikepie

Distinguished
Dec 14, 2011
1,612
0
19,810
Will Haswell make my water cooling a waste with its fancy power-saving features? I mean, it's stupid; can't there be some setting in the BIOS for overwatting my CPU?
 
i guess: go to bios > set the current and power limits as high as you can > go crazy with the voltage settings.
haswell will likely have the same ever-smaller-cpu-area-on-die issue as ivy bridge, so water cooling would still be needed for a high overclock....
although i don't think intel allows their cpus to use so much power that they fry instantly, gradual degradation is quite possible.
otoh, i think amd unlocked cpus can be 'overwatted' quite easily. amd's are fully unlocked. :D

Core i7 4765T is 35W quad Haswell
http://www.fudzilla.com/home/item/30339-core-i7-4765t-is-35w-quad-haswell
Intel [strike]the ones with BSdp[/strike]Y-series Ivy Bridge lasts until Q4 13
http://www.fudzilla.com/home/item/30338-intel-y-series-ivy-bridge-lasts-until-q4-13
 

lazykoala

Honorable
Feb 4, 2013
137
0
10,710
Haswell will be compatible with all current and soon-to-be-released GPUs, right? I don't want to buy a GPU now and end up having it be incompatible with a new CPU in a few months!
 


Very much doubt that; PhysX isn't going away, if for no other reason than that no one else has bothered to create a dedicated physics engine as capable as PhysX yet. Wouldn't be shocked if DX eventually adds a physics API that looks VERY similar to PhysX in layout. [Hey, SM 1.1 was basically NVIDIA's proprietary shader model implementation, so it wouldn't be the first time...]
 

mayankleoboy1

Distinguished
Aug 11, 2010
2,497
0
19,810



Not sure how 'capable' PhysX is. IIRC, it uses ancient x87 instructions, which caps performance on modern processors.
Most gamers would be ecstatic if DirectX added a vendor-agnostic, GPU-accelerated physics engine...
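
For anyone who wants to see the x87-vs-SSE difference concretely: the same scalar C compiles to either instruction set depending on compiler flags (32-bit x86 with gcc assumed here; the function is just a toy, nothing PhysX-specific):

[code]
/* fp_demo.c -- toy routine to compare codegen.
 *
 * x87 stack code:  gcc -m32 -O2 -mfpmath=387 -S fp_demo.c
 * scalar SSE:      gcc -m32 -O2 -msse2 -mfpmath=sse -S fp_demo.c
 *
 * The x87 build juggles a register stack (fld/fmul/faddp); the SSE
 * build uses flat xmm registers and can be vectorized further. x87 was
 * the old default on 32-bit targets, hence the old PhysX criticism. */
float dot3(const float *a, const float *b)
{
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}
[/code]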
 


*Used*. The API was rewritten as of version 3 (May 2011 or so, if I recall).

And in terms of what it is able to simulate at a reasonable speed (at least on a GPU, which IIRC is where dynamic physics *should* be computed), nothing else comes even close.
 

ph1sh55

Distinguished
Dec 29, 2011
38
0
18,530


The CPU won't cause an incompatibility with a discrete GPU, so you don't have to worry about this. Any motherboard for the foreseeable future should have a PCIe slot for your GPU.
 

RussK1

Splendid


Havok... favored by far, and it uses SSE instructions.
 


I highly doubt that Intel's new processor will have AMD-designed graphics. NVIDIA would be more likely, but it's still very unlikely to happen. I bet they stick with the current stuff they're running until/unless they end up buying NVIDIA, in which case they'll roll in GeForce GPU technology. Intel would actually do much better to hire programmers who can write a GPU driver worth half a crap than to throw more hardware at their GPU problems right now.

Also, you might want to work on your spelling and capitalization. Tom's isn't a text message or Twitter; you can use regular English and aren't limited to 140 characters :D
 
Intel's iGPU is like a spray gun: in a few isolated tasks it is comparable, but in general the HD 4000 is at least 10-15% slower than the iGPU on a 3850/3870K and around 40-45% slower than the one on the 5800K. GT3 will probably match the high-end Llano parts in gaming and Trinity's 7660D in some tasks. The problem is the expense ratio and eventual SKU pricing for GT3 parts. It will only be on the high-end i7s and i5s, which cost 2.2x and 3.5x as much as a 5800K, and manufacturing-wise even more.

I would say the true improvement is around 30%, give or take, but in some games it still falls flat, almost as if the developers' engines don't recognise Intel HD. And NO, Skyrim doesn't max out, not by a long shot.

Where Intel will probably beat AMD is on the mobility front. With AMD still working on power issues, its integrated solutions on mobile platforms are very cut back. Hopefully with Richland and Kaveri, AMD will lower the thermals enough to throw more stream processors onto a very skint mobile lineup.
 

ph1sh55

Distinguished
Dec 29, 2011
38
0
18,530
if june 2nd is the general launch, it will likely be on that timeline or within a week after. K skus don't usually trail the initial launch; at least, I don't think they have in years past.
 