Intel Haswell iGPU to support DirectX 11.1, OpenGL 3.2

[citation][nom]waethorn[/nom]You won't see this in Windows 8 systems because Microsoft doesn't allow switchable graphics in OEM-built hardware. They see it as a cop-out for poor power management in the higher-end GPU, and they're right - the GPU should be able to dynamically adjust power requirements by demand. Why should your NVIDIA card draw more power than anything else for just running Aero and Metro when an ARM processor GPU is far less powerful and it can already do this?[/citation]

Actually, Intel doesn't supply the driver for it, it HAS to be integrated by the OEM into their own driver... If it works in Win 7, why not Win 8?
 

amd apu prices have decreased since launch, and got even lower when the new k series apus launched.
iirc intel's cpus, e.g. the core i5 2500k, are selling at a slightly higher price than at launch, because nothing from amd can compete with them in performance or otherwise. however, amd's current fx cpus are overpriced for their performance and power efficiency.
you should get out of your distorted reality and check with the real world sometimes.
i do my own research and i am not paid by anyone. you on the other hand, seem to be drowning in amd.... :)

name calling eh. that's a proven c.a.l.f. right there. 😀
according to your own incorrect logic, one can accuse you of being paid by amd. did you realize that?
you're right, consumers are too smart for your nonsense to work and will buy the products they see fit.

true @ pentium's multithreading weakness. then again, if you're running workloads like blender, handbrake or cinebench on an entry level cpu, you're not gaining anything other than test results.
my issue with llano is not the performance (it's amazing how amd delivered such a well-performing cpu+igpu combo at that price level), but the upgradability. so far i've read conflicting rumors about trinity retaining socket fm1 compatibility. if trinity ends up requiring the new socket fm2 - which i believe is quite possible - llano owners would be left without an upgrade path. time will tell, i guess...
 
[citation][nom]danbfree[/nom]Actually, Intel doesn't supply the driver for it, it HAS to be integrated by the OEM into their own driver... If it works in Win 7, why not Win 8?[/citation]

Any OEM system that ships with Windows 8 will be logo certified, and Microsoft doesn't allow it as part of the new logo requirements. They only allow a single target GPU for output. This works for SLI and CrossFireX, but not switchable graphics. The graphics driver in Windows 8 also moves to a new WDDM version (1.2 if memory serves) for DX11.1, so I would guess that Windows 7 graphics drivers won't give you full performance or features, nor will they load without WHQL warnings. You won't see Optimus or Hybrid SLI for power management in these systems. I believe WDDM 1.2 also requires that multiple video output ports be handled by a single GPU entry instead of showing two or three like in previous Windows versions.
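To make the "single GPU entry" point concrete, here's a minimal sketch (plain DXGI adapter enumeration, nothing Windows 8-specific, purely illustrative) that lists how many GPU entries the OS exposes. On a Win 7 switchable-graphics laptop you'd typically see two adapters; under the rule described above you'd see one:

```cpp
// Minimal sketch: list the GPU "entries" the OS exposes via DXGI.
// Build with MSVC and link dxgi.lib. Illustrative only.
#include <windows.h>
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // A switchable-graphics laptop typically shows two entries here
        // (IGP + discrete); a single-GPU-entry system shows one.
        wprintf(L"Adapter %u: %s\n", i, desc.Description);
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```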
 

It reports the average value too. It reports my proc's TDP as 95 W, when in fact it has a max TDP of 142 W and a min TDP of ~45 W. This I know from Sandra.

Anyway, if you look back at the Sandy Bridge review, or indeed the FX or Llano reviews, there should be a nice graph that shows power consumption.

IIRC the 2500 and 2600 CPUs reach 160 W max.
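For anyone who wants to sanity-check numbers like these without Sandra: on Linux with a Sandy Bridge or newer Intel CPU, the RAPL interface exposes a running package energy counter, and reading it twice gives average power over the interval. This is a hedged sketch, not how Sandra measures; the sysfs path is an assumption and may differ per system:

```cpp
// Sketch: average CPU package power from the Intel RAPL energy counter.
// Assumes /sys/class/powercap/intel-rapl:0/energy_uj exists (Sandy Bridge+,
// Linux with the powercap driver); may need root. Counter wrap is ignored here.
#include <fstream>
#include <iostream>
#include <thread>
#include <chrono>

int main() {
    const char* path = "/sys/class/powercap/intel-rapl:0/energy_uj";
    auto read_uj = [&]() {
        std::ifstream f(path);
        long long uj = -1;
        f >> uj;
        return uj;
    };

    long long e0 = read_uj();
    std::this_thread::sleep_for(std::chrono::seconds(1));
    long long e1 = read_uj();
    if (e0 < 0 || e1 < 0) { std::cerr << "RAPL not available\n"; return 1; }

    // energy_uj counts microjoules, so the delta over one second is average watts.
    std::cout << "Average package power: " << (e1 - e0) / 1e6 << " W\n";
    return 0;
}
```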
 


Thanks for taking the time to reply. I guess I have some playing around to do at work today to test some of your theories... Sorry, I can't say anything more than that. 😀
 
[citation][nom]maxinexus[/nom]Intel right now is about 6 to 7 years behind Nvidia and AMD. HD3000 is comparable to Nvidia GT6600 which is like 2004 time. Even if the next generation will be 5 times faster than the current one, still 5 times poopoo = more poopoo. I had the "honor" of using HD3000 on an i5 2500K when I RMAed my 6950s and oh boy I thought I was gonna have a heart attack. I used my old laptop with WinXP instead... then I got bored, got an FX8150, much snappier than the i5[/citation]
I'm testing HD3000 right now and it is MUCH better than I expected it to be, and I already had high expectations before I started testing. If you're planning on running Crysis on absolute max at higher than HD resolutions then obviously it's insufficient, but I've been testing games on medium and high at 1080p and it runs really well.

I think Ivy Bridge and Haswell graphics are looking to be very interesting. The current HD3000 graphics are bordering on "playable" in modern games, so you only need a small boost to push the FPS into the fully "playable" region.
 

Hmmm...you may have a point...looked it up...scott mueller's book (strangely) doesn't have TDP in its index...not even its full form...so i'm forced to quote Wikipedia...
The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of power the cooling system in a computer is required to dissipate. For example, a laptop's CPU cooling system may be designed for a 20 watt TDP, which means that it can dissipate up to 20 watts of heat without exceeding the maximum junction temperature for the computer chip.
The TDP is typically not the most power the chip could ever draw, such as by a power virus, but rather the maximum power that it would draw when running "real applications". This ensures the computer will be able to handle essentially all applications without exceeding its thermal envelope, or requiring a cooling system for the maximum theoretical power (which would cost more, in exchange for extra headroom).
In some cases the TDP has been underestimated, such that in real applications (typically strenuous ones, such as video encoding or games) the CPU exceeds the TDP. In that case, the CPU will either cause a system failure (a "therm-trip") or throttle its speed down.[1] Most modern CPUs will only therm-trip on a catastrophic cooling failure, such as a stuck fan or a loose heatsink.
Honestly don't know what to make of it, seems to be mixing two things...
read more here
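Since the quote mixes the cooling budget with actual power draw, a toy calculation may help separate the two things. All numbers are assumed for illustration (typical-ish laptop values, not from any datasheet): junction temperature is roughly ambient plus power times the cooler's thermal resistance, and TDP is the power level the cooler was sized for:

```cpp
// Toy model: Tj = T_ambient + P * theta. If the chip draws more than the
// TDP the cooler was designed for, Tj can exceed the limit and the CPU
// throttles or therm-trips. All constants are assumed for illustration.
#include <cstdio>
#include <initializer_list>

int main() {
    const double t_ambient = 35.0;  // C inside the chassis (assumed)
    const double theta     = 3.0;   // C per watt for the heatsink+fan (assumed)
    const double t_max     = 100.0; // max junction temperature (assumed)

    for (double power : {15.0, 20.0, 25.0}) {
        double t_junction = t_ambient + power * theta;
        std::printf("%4.0f W -> Tj = %5.1f C%s\n", power, t_junction,
                    t_junction > t_max ? "  (over limit: throttle or therm-trip)" : "");
    }
    return 0;
}
```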
 
[citation][nom]XLR[/nom]While everyone else pisses on each other over the value of various processors, I'd like to know why OpenGL support in Haswell is stuck at version 3.2, which was released in 2009. Shouldn't the gpu be up to *at least* OpenGL 4.2?[/citation]

Better to support fewer features than to lack the raw performance to use the new features.
 
[citation][nom]frozonic[/nom]i dont know about you guys but intel graphics is very decent for the average user. in fact, the sandy bridge GPU is better than an AMD 5450 or 6450, and i think that this gpu upgrade that will be implemented on ivy bridge is going to be really cool. i think intel should also be making high end and mid range gaming gpus. that will add more competition to the GPU/PC sector and maybe... if done right... better performance & lower power usage per dollar than AMD and Nvidia. intel FTW![/citation]

Intel's best IGP for Sandy Bridge, HD 3000, is about 25% faster than the 5450... In other words, a little over HALF of a 6450. Get your facts straight. This means that Intel's upcoming HD 4000, if it really is 60% faster than HD 3000, is at best about as good as the Radeon 6450, and HD 3000 is left in the dust by a $20-30 card. I have no doubt that it is significantly faster, and 60% is plausible, but until I see benchmarks it isn't proven.
Add in something like even a Radeon 6570 or GT 240 and the 6450 is left in the dust... Both cards are about the same price as the 6450 on Newegg while being around twice as fast. That means that dirt cheap video cards still leave Sandy and Ivy Bridge graphics behind, even cards that are THAT low end, never mind a mid-range card or Trinity's graphics. Llano A6s and A8s are still faster than HD 4000 and they aren't new. (Rough math below.)
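For anyone checking the percentages above, here's the back-of-envelope normalization behind them. Every ratio is this post's own rough estimate (the 5450 taken as the 1.0 baseline), not benchmark data:

```cpp
// Rough relative-performance math from the estimates in the post above.
#include <cstdio>

int main() {
    const double hd5450 = 1.0;           // baseline
    const double hd3000 = 1.25 * hd5450; // "about 25% faster than the 5450"
    const double hd6450 = 2.0  * hd5450; // 6450 taken as roughly twice a 5450
    const double hd4000 = 1.6  * hd3000; // "60% faster than HD 3000"
    const double hd6570 = 2.0  * hd6450; // 6570/GT 240 "around twice as fast" as a 6450

    std::printf("HD 3000 = %.0f%% of a 6450\n", 100.0 * hd3000 / hd6450); // ~63%
    std::printf("HD 4000 = %.0f%% of a 6450\n", 100.0 * hd4000 / hd6450); // ~100%
    std::printf("HD 4000 = %.0f%% of a 6570\n", 100.0 * hd4000 / hd6570); // ~50%
    return 0;
}
```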

[citation][nom]fazers_on_stun[/nom]Supposed to be 5X performance increase as well..[/citation]

Five times the performance of what? The HD 2000, 3000, 4000, or anything in between?

[citation][nom]sonofliberty08[/nom]i bet intel has already been reverse engineering what the amd & nvidia gpus have got for quite some time[/citation]

Intel's graphics are nothing like AMD or Nvidia graphics.

[citation][nom]mikenygmail[/nom]I see the Intards and Nvidiots, paid by their respective bullying companies, are now all over the internet. First it was the yahoo finance AMD message boards, now they're on google and every major hardware site as well. Consumers are too smart for your nonsense to work, and we will all buy superior AMD products.[/citation]

Intel makes much faster CPUs, and Nvidia's and AMD's discrete graphics are better than any integrated graphics. Am I an idiot if I buy a CPU such as the i5-2500K for $230 instead of the FX-8150 for $270 when the i5 is far faster for gaming, the whole point of most graphics and CPU oriented articles on this website? Even the top i3s beat AMD's best CPUs in gaming unless you give the AMD chips huge overclocks that make them use two to four times more power for a slight edge.

AMD's graphics are overall better than Nvidia's at the same or similar price points, but it's not a huge difference at the high end, where Nvidia is focused. The GTX 560 Ti is about as good as the Radeon 6950, but according to the latest gaming graphics cards hierarchy from Tom's, the GTX 560 Ti uses considerably less power than the 6950. However, for multi-GPU setups the 6950 shows clear advantages, especially since it's capable of three-way CrossFire while the 560 Ti can't go past two-way SLI.

Overall, AMD is simply better in the low end and mid-range because Nvidia is either too power hungry, too expensive, or both. Nvidia is ignoring the low end and leaves it to their older cards, and really, the only current-generation card of theirs I'd consider is the GTX 560 Ti. The 560 uses too much power for its performance and the 570/580 are too expensive for theirs. The weaker 550/550 Ti aren't any better either.
 
[citation][nom]gvbnhyt123[/nom]Wrong again, an A8 APU equipped gaming computer is far more powerful than any Intel CPU + GPU combination. The A8 is also far more efficient.[/citation]

Wrong again, I can get a Pentium G620 and a Radeon 6670 for the same price as an A8 and that will beat the A8 easily. A 5670 may be a little cheaper with the same performance too.

An A8 is almost as fast as the Radeon 6570, which isn't even a half-decent gaming card for modern games. I wouldn't use anything less than the 6770, and it's good for 720p max settings gaming, maybe slightly higher or lower resolutions. The 6770 is cheap enough to call a low-budget card; it's well under $100.

Llano is more powerful than Intel's HD graphics, but not more powerful than a decent discrete video card. An A6 will beat even HD 4000, but the CPU side would be radically different. Ivy probably has almost 50 or 60% more IPC than Llano, and Ivy's versions of the Pentiums will be pretty good. If, as Intel says, they will cost the same as the Sandy Bridge CPUs they replace (I'm looking at you, AMD; that's how it should be for your chips too, or at least close to the same prices), then the Pentium + budget discrete video card combo will continue to beat AMD's similarly priced APUs. With Trinity stepping up to A10s, we might need to count on a better CPU than the G620 and a faster graphics card too; we might need to step up to the 6750.
 
@blazorthon:

^^ this is the real post. i assume you didn't see it because it got downvoted for obvious reasons.
i think you replied to a spammer. it has the exact text from the earlier post, but with an invisible url.
 