Will there be a 'new' manufacturers' race to 'Fusion'?



I'll at least give you an 'A' for effort.

'Almost on par' with the 8400M G low-end entry-level graphics? Good one.

100% improvement over the GMA X3100? Maybe for the math-challenged in cherry-picked benchies...
 
Your point? I've demonstrated that the X4500HD is much more capable than "fairly underwhelming overall", being able to run F.E.A.R., albeit at low resolutions. What have you demonstrated?
 
^That he doesn't like Intel IGPs like most people?

It seems decent for what it is: a low-cost solution that can play HD video and some mid-range games (seriously, who in their right mind plays high-end games on an IGP?), considering that most laptops are a business-minded solution.

At the very least, a laptop is used for mobile Internet/entertainment, mainly for school and work.
 



First, the material WAS NOT Styrofoam. Should I send you the thicker stuff that surrounded my laptop? You are still stuck on an admittedly incorrect definition.

Oh, and it was DUAL core for dummies. AMD's quad is native. Fusion is a dual-core CPU with a PCIe tunnel and no bridge, on a package with a GPU.
 



Fab 36 uses SOI wafers. TSMC uses SiGe wafers. In order to make Radeons, they would need to either bring in the wafers or qualify production on SOI. It's being reported that ATi is going to 40nm with TSMC; 45nm is supposedly an optical shrink while 40nm is a physical shrink. Perhaps they can assemble it at one of their assembly plants, but... they're still better than crayons.
 



Like you say, did you actually read that? Q3 Arena? F.E.A.R. at 640x480, 7 FPS at 1024? Doom 3 UNPLAYABLE AT ANY RESOLUTION?

Wow, maybe I should get one of those Darth Vader FAIL pictures.
 
So now we're supposed to play graphics-intensive games on IGPs? Please take your anti-company sentiment elsewhere.

fail.jpg
 


O Rly? Shall I necro-thread a little? As always, you have a short memory. While those threads were 'lost' (as were so many others during the last site update), I just happen to know where copies are stored.

The material WAS styrofoam; it was incorrect in every aspect, from the initial post to the last dying gasp for legitimacy, and it was only one of many epic threads which firmly removed any doubt about taking you seriously.

It most certainly was not "DUAL" cores for dummies:

http://images.google.com/imgres?img...uad+core+for+dummies&um=1&hl=en&safe=off&sa=G

amd_mc_processing.jpg


While the name was Multicore, AMD's intent was to belittle the MCM. As Intel did not have a dual-core MCM, but a quad-core MCM based on two native dual cores, the implication was clearly, beyond a shadow of a doubt, incontestably "quad" cores, NOT dual.
 


Intel IGPs can handle solitaire...

With an overclock they might be able to take on minesweeper.




But anything above that and you are running at minimal settings. No one can deny Intel IGPs are crap, absolute utter crap, and they are creamed by both Nvidia's and AMD's equivalent offerings.
 


But the Intel GMA X4500 can do it with DirectX 10.0 and Shader Model 4.0 :ouch:
 


Untrue. Yeah, I don't think they are great in any way, but I can play Warcraft 3 at 1280x1024 - everything on HIGH - with a Pentium D 820 @ 2.8 GHz, 2 GB DDR2-667, and an Intel 945G. : P The average FPS is around 29.2.
 


Isn't AMD going to go MCM with their 12-core? That will be interesting, really. Talking trash about something that works and beats your offering, then using the same thing later on. Ahh, I love it.

I still think it's funny. Gaming on a laptop has never truly been gaming. Heck, back when the X800 and GF6 were out for mobile, they could still only play Doom 3 at 800x600 with playable framerates. While I am sure that has improved, I am still going to doubt it will play current-gen games decently.

Meh. I still want to see Larrabee and the resulting IGPs' performance in games, but waiting is hard. And Baron using Intel's IGPs as a reason it will suck is hilarious.

Let the good times roll.
 


You might not thank me for saying this... 😀

But that is a six-year-old game!



ep2.gif


I have to say I do have some doubts you're hitting near 30 FPS!!! (at least consistently)


A fella I used to live with had a P4 with a 945G, and the damn thing would chug when faced with C&C Generals and a lot of units on the screen (not that I minded too much - it usually meant I'd beat him in the game!). Eventually he got a Radeon 9800 on the cheap and that did the trick.
 


Hahaha! I thank you for saying this - especially since you have gone through the trouble of posting the graph! : P

The oldies rock my world! ;D

Anyway, I really do hit near 30 FPS most of the time; however, as you stated, I have some problems when there are 20-30 units on the screen. Sometimes it drops to as low as 12 FPS - for 2 or 3 seconds - or goes up to as much as 38-41. Probably the game is still more CPU-dependent at this resolution, and I know it was optimized to work better with multi-core CPUs in the latest patch. I tried using a GeForce 7200GS (an IGP in disguise, I know : P) and it didn't improve anything.

Anyway, I would love to test it with the upcoming 790GX. =D
 
^I for one can believe he can hit 30 FPS in Warcraft 3. HL2 EP2 has a lot more advanced features, and to tell you the truth, Amiga, Source is very CPU-dependent.

BTW, what CPUs were used for the AMD and the Intel setups? Because at 800x600 the CPU is a big limiter, and HL2 EP2 has a lot of enhancements to the physics/particle engines....
 


I would think 45-40nm with TSMC would be the point where they could jump to SOI on the graphics side.

Theo may have nailed this one ...
AMD Outsources To TSMC For CPUs
http://www.tomshardware.com/news/cpu-phenom-amd,5370.html
May 14, 2008

It'll be interesting to see how AMD enforces 'proprietary' tech with TSMC.
 


Again, biased opinions based on no actual facts.
 

Anand good enough for you?
http://www.anandtech.com/cpuchipsets/intel/showdoc.aspx?i=3356&p=7

we can only conclude about Centrino 2 what we know on paper. It shouldn't really be any faster, clock-for-clock, than the Santa Rosa Refresh based Centrino notebooks. Initial results we've seen from OEMs that have gotten systems to work shows that performance and battery life of their new Centrino 2 systems aren't any different than their previous Santa Rosa Refresh systems. Compared to earlier Santa Rosa and Napa machines, the upgrade should be worth it, but if you just bought a notebook - don't be fooled by the 2, it's not time to upgrade.

what we need from Intel or a capable OEM to truly determine the worth of Centrino 2:

- A fully working, fully optimized Centrino 2 notebook
- A similarly configured Santa Rosa Refresh notebook for comparison
- The ability to switch between WiFi Link 5300 and 5100 cards to truly determine their tangible value
- A Centrino 2 system with discrete graphics to truly evaluate how the switch between IGP and discrete graphics works
- Working GM45 drivers with full video decode support and proper application support for it as well. Many of these notebooks will be shipping with Blu-ray drives and in the interest of actually being able to watch a Blu-ray movie on a battery, hardware decode acceleration needs to work.
 
How does what you posted prove that GM45 is lacking high-definition content playback capability? How does that prove the X4500HD is lacking in performance?

I posted earlier, with real benchmarks, from notebookcheck:
http://www.notebookcheck.net/Intel-Graphics-Media-Accelerator-4500MHD-GMA-X4500MHD.9883.0.html

Not only did the X4500HD outperform the entry-level discrete graphics cards from last year, it's almost on par with Nvidia's last-generation mid-level discrete graphics card (GeForce Go 7400).

Is it strong by any means? Absolutely not. Is it competitive against AMD's 780G? Not at all. What the X4500HD signifies, however, is that Intel is already making significant improvements to its IGP technology.

On the other hand, ATi's IGPs don't seem to have improved much at all, if 3DMark06 is any indication. The difference between the 690G (HD2400 core) and the 780G (HD3200) is a measly 300 points.