Reminder: Don't Forget to Watch the PlayStation Event

[citation][nom]Jprobes[/nom]Considering that about 98% of the home PCs in America don't even have those specifications, and by the way, what is the memory bandwidth of a PC? 21-27GB/s max? Not very impressive to this 34-year-old. Then again, I was referring to game development. Now that a console will be released which closely resembles a PC, it will help facilitate development. FYI, the PS4's memory bandwidth is ~130GB/s, and I can't imagine that its GPU bandwidth would be anything less than what a PCIe 3.0 x16 slot offers.[/citation]

I guarantee that the PS4's RAM has far lower memory bandwidth than the graphics memory of a high-end PC with a card like a Radeon 7870 XT or better.

Also, people, please stop saying DDR5... We all know that if it ends in a 5, it's almost definitely GDDR5, especially since there is no DDR5.
 

kinggremlin (Distinguished, joined Jul 14, 2009)
[citation][nom]blazorthon[/nom]We all know that if it ends in a 5, it's almost definitely GDDR5, especially since there is no DDR5.[/citation]

Wow, you must be a genius. The presenter said it was GDDR5 when he announced it in English, and you somehow understood the English, even though you still had to speculate that it was GDDR5.
 
Guest
[citation][nom]Jprobes[/nom]Considering that about 98% of the home PCs in America don't even have those specifications, and by the way, what is the memory bandwidth of a PC? 21-27GB/s max? Not very impressive to this 34-year-old. Then again, I was referring to game development. Now that a console will be released which closely resembles a PC, it will help facilitate development. FYI, the PS4's memory bandwidth is ~130GB/s, and I can't imagine that its GPU bandwidth would be anything less than what a PCIe 3.0 x16 slot offers.[/citation]
That's the point! They talk about this as being the best thing since ice cream, but it's just a box that wasn't even shown, with PC specs from last year.
 
Guest
[citation][nom]otacon72[/nom]Those specs you mentioned are ridiculous for a console because they aren't needed. If you knew what you were talking about, you'd know every component in a console is optimized and that 4K TVs won't be mainstream for years. Your condescending tone at the end is pathetic, too... grow up.[/citation]
Still not impressed, and I promise you I'm more grown up than you are, boy.
 
[citation][nom]kinggremlin[/nom]Wow, you must be a genius. The presenter said it was GDDR5 when he announced it in English, and you somehow understood the English, even though you still had to speculate that it was GDDR5.[/citation]

[citation][nom]Jprobes[/nom]PC Gamers should be happy, it is x86 based console with 8gb of DDR5 ram.Hopefully the quality of PC titles will be lifted due to the architecture closely resembling a PC.[/citation]

I was obviously saying what I said in reply to posts like this one, and if you really want to get into what was said on stage, I'll remind everyone that dozens of systems lie about what type of memory they have, so it wouldn't even be the first time, nor the dozenth, if that were the situation here. Furthermore, how would you know whether I watched the video before explaining that there's no such thing as DDR5 to the person who said DDR5? Even if I had, it's not like it would have mattered.
 

kinggremlin (Distinguished, joined Jul 14, 2009)
[citation][nom]Jprobes[/nom]FYI, the PS4's memory bandwidth is ~130GB/s, and I can't imagine that its GPU bandwidth would be anything less than what a PCIe 3.0 x16 slot offers.[/citation]

According to Anand, Sony announced the bandwidth will be about 176GB/s to the CPU and GPU. That is a ridiculous amount that destroys what a PC can do.
 
[citation][nom]kinggremlin[/nom]According to Anand, Sony announced the bandwidth will be about 176GB/s to the CPU and GPU. That is a ridiculous amount that destroys what a PC can do.[/citation]

The Radeon 7870 XT has 192GB/s of memory bandwidth to the GPU, and it's not even close to the only card with that much, let alone anywhere near the top, which is around 300GB/s per GPU for some Tahiti cards and Titan. CPUs obviously don't have that much, but memory bandwidth isn't a bottleneck for CPUs in gaming anyway, so it's not like that matters.
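
If anyone wants to sanity-check those figures, peak bandwidth is just the effective transfer rate times the bus width divided by eight. Here's a minimal C sketch; the data rates and bus widths are typical/assumed values for illustration, not official spec-sheet numbers:

[code]
/* Rough peak-bandwidth arithmetic for the numbers in this thread.
 * Data rates and bus widths are assumed/typical, not official specs.
 * Build: cc bandwidth.c -o bandwidth */
#include <stdio.h>

/* peak GB/s = effective transfer rate (GT/s) * bus width (bits) / 8 */
static double peak_gbps(double gtps, int bus_bits)
{
    return gtps * bus_bits / 8.0;
}

int main(void)
{
    printf("Dual-channel DDR3-1600:                 %6.1f GB/s\n", peak_gbps(1.6, 128));
    printf("PS4 GDDR5 (assumed 5.5 GT/s, 256-bit):  %6.1f GB/s\n", peak_gbps(5.5, 256));
    printf("Radeon 7870 XT (6.0 GT/s, 256-bit):     %6.1f GB/s\n", peak_gbps(6.0, 256));
    printf("Tahiti/Titan class (6.0 GT/s, 384-bit): %6.1f GB/s\n", peak_gbps(6.0, 384));
    /* PCIe 3.0 x16, one direction: 8 GT/s * 16 lanes * 128/130 encoding / 8 bits */
    printf("PCIe 3.0 x16 (one direction):           %6.1f GB/s\n",
           8.0 * 16 * (128.0 / 130.0) / 8.0);
    return 0;
}
[/code]

That prints roughly 25.6, 176, 192, 288, and 15.8 GB/s, which lines up with the figures being quoted here: ~176GB/s for the PS4's shared pool, 192GB/s for the 7870 XT, near 300GB/s for the big GPUs, and far less for plain DDR3 or a PCIe link.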
 

kinggremlin (Distinguished, joined Jul 14, 2009)
[citation][nom]blazorthon[/nom]The Radeon 7870 XT has 192GB/s of memory bandwidth to the GPU, and it's not even close to the only card with that much, let alone anywhere near the top, which is around 300GB/s per GPU for some Tahiti cards and Titan. CPUs obviously don't have that much, but memory bandwidth isn't a bottleneck for CPUs in gaming anyway, so it's not like that matters.[/citation]


That's about 4-7x the bandwidth a PC CPU will receive. Look again.
 
[citation][nom]kinggremlin[/nom]That's about 4-7x the bandwidth a PC CPU will receive. Look again.[/citation]

I don't need to look at anything again. Nothing I said in my comment is wrong, and if you think so, then I suggest you look at it again. I clearly stated that PC CPUs don't have that much bandwidth, and the fact that they aren't bottlenecked by their current memory means that not having almost 200GB/s shared with them doesn't matter in the least. That big shared memory interface on the PS4 is undoubtedly used almost purely by the GPU anyway.
 

nolarrow (Distinguished, joined Mar 27, 2011)
Whatever, all this fighting is pointless. Can we all just get together and lol over how Blizzard trolled with a Diablo 3 port? F'ing hilarious.

The Killzone game didn't look any different from what we can do on PC today. I went out for food during the car game demo, came back, and saw that real-life-looking Skyrim-style stuff from Capcom and the Square demo that I saw on YouTube a while back; the cell-phone hacker game looked stupid and the graphics were just okay. Can't say I was blown away, and I wanted to be blown away; I'm all about better graphics regardless of platform. If it's good, I'll just buy it.

At least that Capcom game gives us a glimpse of the quality we might see ported to the PC in the future. I'd say that was the most "next-gen" thing showcased.

The built-in streaming could be huge for a lot of gamers; I enjoy a good stream now and again. Easy upload of videos and pics could be cool for some people, too.

What is really going to make or break Sony on this one is what they do for their next-gen multiplayer platform, or PSN 2.0. Everyone I know who bought a PS3 eventually bought an Xbox so they could enjoy a better multiplayer experience. I never owned a PS3, so I can't comment on the quality of PSN, but it seems like everyone favored Xbox Live in this last console war.
 

Bloob (Distinguished, joined Feb 8, 2012)
[citation][nom]blazorthon[/nom]The Radeon 7870 XT has 192GB/s of memory bandwidth to the GPU, and it's not even close to the only card with that much, let alone anywhere near the top, which is around 300GB/s per GPU for some Tahiti cards and Titan. CPUs obviously don't have that much, but memory bandwidth isn't a bottleneck for CPUs in gaming anyway, so it's not like that matters.[/citation]
Wrong, wrong, wrong, wrong. Even with the best-designed games, cache misses occur all the time, slowing the game down precisely because you then have to load data from the ever-so-slow RAM.
 
[citation][nom]Bloob[/nom]Wrong, wrong, wrong, wrong. Even with the best-designed games, cache misses occur all the time, slowing the game down precisely because you then have to load data from the ever-so-slow RAM.[/citation]

Ah yes, that's why every time we test the impact of memory overclocking, it hardly changes gaming performance at all. Surely every memory bottlenecking test done on modern hardware in gaming must be wrong and you must be right.
 

belardo (Splendid, joined Nov 23, 2008)
I had to go to GameSpot to find a live video link... (yesterday)

The games looked pretty good. Killzone looked great.

I still see the trend of AAA titles going to consoles, with the PC getting some ports that will be NO better than the console versions.

Proof? Look at the stagnant PC graphics card business. For about 5 years, since the ATI HD 3000 series, both ATI/AMD and Nvidia were doing new cards every 3~6 months. Unless you're doing multi-monitor (3/4/6), any $150~200 card will do.

Maybe... maybe if 26~27" monitors come standard with 2560x1600-type displays for under $400, we'll see a real need for more... and 4K is only useful for 50+ inch TV sets, not a desktop display.
 

d_kuhn (Distinguished, joined Mar 26, 2002)


I have 27" and 30" 2560-wide monitors... I could use even more resolution on both of them (I can still make out individual pixels from normal viewing distances if I'm paying attention). 4K may be overkill, but not by much. I also think 4K would be close to the resolution needed to eliminate the screen-door effect without needing AA; it's still annoyingly visible at 2560.

And of course... a 4K display would be REALLY useful when editing 5K video footage and stills.
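
To put rough numbers on "still make out individual pixels": pixel density is just the diagonal pixel count divided by the diagonal size in inches. A quick C sketch, with the diagonals and resolutions below assumed for illustration:

[code]
/* Quick pixels-per-inch comparison for the monitor sizes discussed above.
 * Diagonals and resolutions are illustrative assumptions.
 * Build: cc ppi.c -o ppi -lm */
#include <stdio.h>
#include <math.h>

/* PPI = diagonal resolution in pixels / diagonal size in inches */
static double ppi(int w, int h, double diag_in)
{
    return sqrt((double)w * w + (double)h * h) / diag_in;
}

int main(void)
{
    printf("30\" 2560x1600:      %.0f PPI\n", ppi(2560, 1600, 30.0)); /* ~101 */
    printf("27\" 2560x1440:      %.0f PPI\n", ppi(2560, 1440, 27.0)); /* ~109 */
    printf("27\" 3840x2160 (4K): %.0f PPI\n", ppi(3840, 2160, 27.0)); /* ~163 */
    return 0;
}
[/code]

That's roughly a 50-60% jump in linear pixel density over today's 2560-wide panels at the same size, which fits the "overkill, but not by much" feeling.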
 

Bloob (Distinguished, joined Feb 8, 2012)
[citation][nom]blazorthon[/nom]Ah yes, that's why every time we test the impact of memory overclocking, it hardly changes gaming performance at all. Surely every memory bottlenecking test done on modern hardware in gaming must be wrong and you must be right.[/citation]
Because the programmers actually know what they are doing and do what they can to reduce the number of those cache misses. That means more memory management, which means less performance overall. The same code that prevents cache misses also denies the benefits of faster RAM.
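
To make that concrete, here is a minimal C sketch of the locality argument (the buffer size and stride are arbitrary assumptions): it performs the same number of array reads twice, once sequentially and once with a large stride. The sequential walk is served mostly by the caches and the hardware prefetcher; the strided walk misses constantly and ends up limited by RAM, which is exactly the access pattern well-optimized game code tries to avoid:

[code]
/* Minimal locality demo: identical work, very different memory behaviour.
 * Buffer size and stride are arbitrary assumptions for illustration.
 * Build: cc -O2 locality.c -o locality */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N      (16 * 1024 * 1024)  /* 16M ints = 64 MB, far larger than any CPU cache */
#define STRIDE 4096                /* jump 16 KB per access to defeat the prefetcher */

/* Sum every element exactly once, visiting them `stride` apart. */
static long long walk(const int *a, size_t stride)
{
    long long sum = 0;
    for (size_t start = 0; start < stride; start++)
        for (size_t i = start; i < N; i += stride)
            sum += a[i];
    return sum;
}

int main(void)
{
    int *a = malloc((size_t)N * sizeof *a);
    if (!a) return 1;
    for (size_t i = 0; i < N; i++)
        a[i] = (int)i;

    clock_t t0 = clock();
    long long s1 = walk(a, 1);       /* sequential: mostly cache/prefetch hits */
    clock_t t1 = clock();
    long long s2 = walk(a, STRIDE);  /* strided: mostly cache misses, RAM-bound */
    clock_t t2 = clock();

    printf("sequential: %.2f s (sum %lld)\n", (double)(t1 - t0) / CLOCKS_PER_SEC, s1);
    printf("strided:    %.2f s (sum %lld)\n", (double)(t2 - t1) / CLOCKS_PER_SEC, s2);
    free(a);
    return 0;
}
[/code]

The strided version typically runs several times slower even though it does the exact same additions; that gap is what careful data layout in engines tries to close, and once it's closed, extra RAM bandwidth buys very little.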
 
[citation][nom]Bloob[/nom]Because the programmers actually know what they are doing and do what they can to reduce the number of those cache misses. That means more memory management, which means less performance overall. The same code that prevents cache misses also denies the benefits of faster RAM.[/citation]

Programmers know what they're doing? Have you seen games and many programs over the last few years? The poor level of optimization in modern software seems to refute that claim of yours quite well, unless programmers knowingly don't do a good job for some reason. Then we've also got the fact that we've made huge leaps in cache technology lately, and much more that most certainly doesn't help your claim. Since it flies in the face of most of what we see today, I'll have to ask you to provide some very conclusive proof, especially since this new claim seems quite out of sync with our previous discussion.
 