AMD CPU speculation... and expert conjecture


jdwii

Splendid



It's all a bit silly, honestly. I think IGN is a little more accurate. What's the point of putting an 8-core CPU into a gaming console, and a weak AMD one at that?

Also, what's the point of going x86? Did AMD have them do that as part of a deal (maybe some special price), or maybe to make ports to the PC easier?

Also, with the PS4 and Xbox 720 both being x86, does that make it harder to port a game to the Wii U (IBM CPU)?



If anyone even bothered to read what I said: what do you guys think should be in the consoles (in all seriousness)?

To keep consoles around $400, what is the smartest choice of hardware to use?

Also, every single thing I said is a question :kaola:
 
At least a GPU of 7850 class or better ;)
An 8-core would be nice; as we all know, the Cell wasn't that dev-friendly at first.
There would be no transitions from one box or PC to another, everything being x86.

I'm given to understand it's easy to outdo the performance of what we currently have on a console today, since it's all dedicated hardware. Everything I've heard and read makes very nice improvements seem plausible.
 
Too lazy and too ignorant about consoles... so don't quote me on the replies, k? :ange:

Please forgive my lack of in-depth knowledge. :sweat:
 
Remember the old PCIe specs?
Back then you could use a top-end GPU and still have a lower total TDP; today, not so much.
Power gating, SMT, CMT: all of this is new as well.

Add in the currently suggested process nodes, the better compute, etc., and they won't need to be top tier.
 

jdwii

Splendid
Ha ha, thanks for the replies.

If it were me, I'd do something different (a 4-core A10 plus a mid-range GPU, which could even run CrossFire, and this would be rather cheap; I'm with gamerk, I don't think games will ever see the light of day past 3-4 cores). I found this article to be a good read:

http://www.gamerevolution.com/manifesto/the-wii-u-hardware-is-fine-graphics-are-costly-and-overrated-16197


At this point I really don't know. I mean, 4K TVs are coming, and 3D at 1080p seems, well, kind of gimmicky. Honestly, I'd like Gamerk316 to state his opinion on this (what hardware should they use when trying to keep the price around $400?), since he has decent knowledge in this area.


Also, a side note:
Nintendo fans are so different from Sony and Microsoft fans. I remember waiting in line overnight for a Wii, ha ha. The whole crowd was just so nice, asking everyone what they wanted and poking fun at people who got a PS3 just to sell it online for more money, while the people getting a Wii were actually getting it to play it.


(My gaming history is down below; it's so different from others' here, I'm sure.)
I remember all the PS2 and Xbox fans back when I was in middle school, always talking about GTA or other M-rated games, while I was into Super Smash Bros. or Luigi's Mansion. Unfortunately, the Wii was kind of a disappointment to me, with only C-rated titles most of the time. I got tired of it and sold it to my mom (lol, it still works fine to this day, never broke once) and got a 360 around 2008.

It broke in six months: the DVD drive gave out (we all know how hard those things are to make; the disc flew across the room). So I went to Microsoft and said, it's been over three years and you guys still don't have this worked out; I sold my Wii for this POS. After that I sold the 360 to a friend once it was fixed, and around Christmas got myself a nice Athlon II X4 620 for $100 and a 9500GT (DDR3 version) for $70, along with a cheap Asus board. I had myself a low-level gaming rig (built from all the other spare parts from my POS Compaq that HP had to fix three times, each time coming back worse than when I sent it in) that honestly made games look better than even my 360.

So, with that horrible experience: never again, Microsoft (and Compaq as well). There is no excuse for a $150+ billion company with a huge name taking so long to fix their product (the disc issue was common, and of course we all knew about the heating issues as well as the external PSU issues).


 


Well, if you wanted to dedicate 1-2 cores for JUST the OS running in the background, that leaves you with 6, the same amount as the PS3.

Also, what's the point of going x86? Did AMD have them do that as part of a deal (maybe some special price), or maybe to make ports to the PC easier?

For one, easier dev attention. Secondly, you get more performance per clock, so you can lower clocks to reduce thermals. [Pro tip: x86 is about 2x the IPC of PPC these days, so 1.6GHz x86 ~ 3.2GHz PPC.]
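
Rough back-of-envelope version of that claim in code; the 2x IPC ratio is my estimate above, not a measured benchmark for any specific CPU:

# Relative single-core throughput estimate: performance ~ IPC * clock.
# The 2.0 vs 1.0 IPC figures are the rough assumption from this post,
# not measured numbers.
def relative_perf(ipc, clock_ghz):
    return ipc * clock_ghz

x86 = relative_perf(ipc=2.0, clock_ghz=1.6)  # hypothetical x86 core
ppc = relative_perf(ipc=1.0, clock_ghz=3.2)  # hypothetical PPC core
print(x86, ppc)  # 3.2 vs 3.2 -> comparable throughput per core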

Also, with the PS4 and Xbox 720 both being x86, does that make it harder to port a game to the Wii U (IBM CPU)?

Yep. Never mind that Nintendo doesn't have a lot of devs to begin with. If the PS4/720 ARE using x86, then the Wii U is basically going to be abandoned by most developers.

If anyone even bothered to read what I said: what do you guys think should be in the consoles (in all seriousness)?

Depends on how you view a console; they are moving in the direction of dedicated gaming PCs, so you need PC-like specs. But at the same time, you need a good enough API so that 5 years down the line, a 7670 isn't holding the console back. Without a strong API, it doesn't matter what you put in the thing.

The move to x86, if true, basically highlights the fact that dedicated consoles are on the way out. You essentially now have non-upgradable mini-PCs.
 


I hope MSFT eventually starts adding dedicated Ray Tracing functions into the DX API; we're starting to near the point where consumer hardware can handle it. I figure about 5 years or so...
 

truegenius

Distinguished
BANNED
Xbox 360
http://game-consoles.venturebeat.com/l/7/Microsoft-Xbox-360-S
512 MB GDDR3 @ 700 MHz shared between CPU & GPU
128-bit channel
Xbox 720
http://vr-zone.com/articles/rumored-xbox-720-and-playstation-4-specs-hit-the-web/18741.html
"Microsoft's console will come with 8GB of GDDR5 RAM, a healthy portion compared to the PS4's allotted RAM serving"
:whistle: Well, I've never even seen consoles like the Xbox or PS, but still...
8GB GDDR5 RAM, up from 512MB GDDR3 @ 700MHz :eek:
Am I smoking too much :heink: ?!?!
If not, then give me some salt.
Actually, give me a heap of salt and pounds of medicine to digest this :p
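
A rough bandwidth comparison to put that jump in perspective; the Xbox 360 figures are its published specs, while the GDDR5 transfer rate and bus width are illustrative guesses, since the 720's memory setup is only a rumor:

# Back-of-envelope memory bandwidth: transfer rate (MT/s) * bus width (bytes).
def bandwidth_gbs(transfer_mts, bus_bits):
    return transfer_mts * (bus_bits / 8) / 1000  # -> GB/s

x360 = bandwidth_gbs(1400, 128)         # 700MHz GDDR3, double data rate
rumored_720 = bandwidth_gbs(5000, 256)  # hypothetical 5 GT/s GDDR5, 256-bit bus
print(x360, rumored_720)                # ~22.4 GB/s vs ~160 GB/s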
 

mayankleoboy1

Distinguished



"we're starting to" and "5 years" are contradictory, IMO.

The question I ask myself is: "What can Caustic Graphics do to make real-time ray tracing a reality in gaming?"
 


GDDR5, which has been around a good two years or so now. And as I doubt the GPU needs that much, I suspect the main memory and GPU VRAM are getting added together in those numbers.
 


What I mean is that while you won't see mainline games using ray tracing anytime soon, you are seeing the tech, hardware, and so forth starting to be developed to move it forward.
 
id Tech 4 was promoted as making "good use" of ray tracing, IIRC.

I remember the lighting effects in DOOM 3, and they were awesome. Killed everything at the time as well, hahaha.

Also, we need the "salt" image we all like so much.

Cheers! :p
 

Cazalan

Distinguished



In retrospect, not that difficult a win, as Nvidia was already designing their own gaming console.
 


Remember that a Ray Tracing algorithm gets insanely complicated as you add light sources, hence why it takes so bloody long. A Ray Casting algorithm, however, remains at about the same speed regardless of the number of sources.

The difference is simple, really: a Ray Tracing algorithm starts at every light source and traces the rays outward. Ray Casting does the opposite, starting at the user's eyes and working back out to the light source. The second is MUCH faster, as the majority of light never reaches the user's eyes and is thus a waste to process, but you lose some accuracy as a result.

In short: the first "Ray Tracing" games will be "Ray Casting" engines. Still better than what we have, though.
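
A minimal sketch of that eye-first approach, for the curious; the scene (one sphere, one light) and every value in it are made up for illustration:

import math

# One primary ray per pixel, cast from the eye into the scene, shaded
# against a single light source. ASCII output keeps it self-contained.

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def hit_sphere(origin, direction, center, radius):
    # Nearest positive t with |origin + t*direction - center| = radius.
    oc = [origin[i] - center[i] for i in range(3)]
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * c  # direction is unit length, so a == 1
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def render(width, height):
    center, radius = [0.0, 0.0, -3.0], 1.0  # sphere in front of the camera
    light = [5.0, 5.0, 0.0]                 # single light source
    for y in range(height):
        row = ""
        for x in range(width):
            # Ray from the eye (at the origin) through this pixel.
            d = [(x - width / 2) / width, (height / 2 - y) / height, -1.0]
            n = math.sqrt(dot(d, d))
            d = [v / n for v in d]
            t = hit_sphere([0.0, 0.0, 0.0], d, center, radius)
            if t is None:
                row += " "
            else:
                # Brightness from the angle between normal and light direction.
                p = [d[i] * t for i in range(3)]
                nrm = [(p[i] - center[i]) / radius for i in range(3)]
                ld = [light[i] - p[i] for i in range(3)]
                ln = math.sqrt(dot(ld, ld))
                ld = [v / ln for v in ld]
                row += ".:-=+*#%@"[max(0, min(8, int(dot(nrm, ld) * 8)))]
        print(row)

render(60, 30)

Note there's no recursion here: each ray stops at its first hit, which is exactly why casting stays cheap as light sources are added (you'd just sum their contributions at each hit point).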
 

How many FLOPS will this take?
 

truegenius

Distinguished
BANNED
Dunno if this was already discussed :??: http://www.mobile01.com/topicdetail.php?f=296&t=3060064&m=f&p=1&img=0
Some news/discussion/rumor (in Chinese; translated below) about AM3+ 10xx chipsets, APU chipsets, and APUs:

A10-6800K  100W  4-core  Radeon™ HD 8670D
A10-6700   65W   4-core  Radeon™ HD 8670D
A8-6600K   100W  4-core  Radeon™ HD 8570D
A8-6500    65W   4-core  Radeon™ HD 8570D
A6-6400K   65W   2-core  Radeon™ HD 8470D
A4-6300    65W   2-core  Radeon™ HD 8370D

Graphics: somewhere between the HD 7750 and 7770.
FM2 will be short-lived, just like FM1.

Fusion Controller hubs: AMD A88X, AMD A78, and AMD A68, replacing the current AMD A75 and AMD A55. All of the above launch in 2013.

PS: As for AM3+, there's no new FX CPU and no new controller hub either; nothing on the long-awaited 1090, 1070, SB1050, or SB1060.

http://northwood.blog60.fc2.com/
It also has some hints about Richland and links for Ivy Bridge Celerons.
 


Depends on how well the algorithm is coded and optimized. You can cut a lot of corners and still get pretty good results (using fewer rays, or not tracing back as far, at the risk of missing some reflection angles).
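
Those two corners look something like this in code; the names, numbers, and the stand-in trace() routine are all made up for illustration:

import random

# The two classic corner-cutting knobs: fewer rays per pixel, and a cap
# on how far reflections are followed before giving up.
MAX_DEPTH = 2          # stop following reflections after 2 bounces
SAMPLES_PER_PIXEL = 4  # fewer rays -> faster render, noisier image

def trace(ray, depth):
    if depth >= MAX_DEPTH:
        return 0.0                      # give up early: cheap, but misses light
    hit = random.random() < 0.5         # placeholder for real scene intersection
    if not hit:
        return 1.0                      # placeholder background/sky light
    return 0.5 * trace(ray, depth + 1)  # follow one reflected ray

def shade_pixel():
    samples = [trace(None, 0) for _ in range(SAMPLES_PER_PIXEL)]
    return sum(samples) / SAMPLES_PER_PIXEL

print(shade_pixel())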
 