Is the AMD FX-8350 good for gaming?

Page 17 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Status
Not open for further replies.


Somebody who doesn't feel the need to hide behind a false name.



Agreed. You could make an argument for Steam's data being representative of PC gamers in general (not sure that I would argue even that though).



People are more likely to switch from pirated XP to pirated Windows 7. You think the average Chinese citizen idolizes and emulates their government and chooses their OS based on what the government uses? If they chose XP over Linux, why would they not choose Windows 7 over Linux?
 


Sure, everybody's doing that. You're clutching at straws now. And I linked Statcounter for the purpose of the Linux stats. It's beyond my control that they offer stats on other things too - doesn't mean I was pointing at those other things. It's primarily browser stats - you're not going to drag browser usage into this as well are you?



Statcounter is free. Her argument there is incredibly weak. I'd be very interested in hearing a genuine argument against the validity of Statcounter for usage share of operating systems. 3,000,000+ global sites across a broad range of interests, providing analytics absolutely free of charge.

If anything, I'd argue that a free, opt-in analytics service would over-represent Linux. How many Linux-oriented sites are going to be interested in contributing Linux-friendly numbers to the stats free of charge? Now how many Windows-oriented sites are going to be worried about their representation in usage stats? A level playing field is the absolute least Linux can expect from Statcounter.

Like I say, I'm very interested in any intelligent argument/point against the validity of Statcounter's numbers.
 


Except that Windows 7 is harder to pirate, and if they were going to do that, they would have already begun switching over...

MS has stated that they have built in much tougher anti-piracy measures, making it harder to crack.
 


I don't know much about pirating it (makes sense that newer versions would be more effort), though my best friends are all running pirated copies of Windows 7 Ultimate. It was just me that paid for it, so it can't be that difficult. Piracy and attempts at piracy prevention have been around for a very long time and the corporations still don't have a solution.

Possibly the slower adoption rate is a result of the use of older hardware that would struggle with 7. Of course that could be a solid argument for using a lightweight Linux distribution.
 


In the first place, Statcounter is a commercial site; they don't live on air. In the second place, they have both free and paid statistics services. In the third place, I already gave you web statistics from an .org site that differ from Statcounter's by 100%; in reality the error is greater than 100%, but I don't need to go into details now. In the fourth place, I already explained to you how web stats overemphasize the North American market and underestimate the rest of the world. In the fifth place, it seems to me that StatCounter counts hits but does not track individual visitors. In the sixth place, many Windows hits come from Internet Explorer, whereas the immense majority of Linux users run Firefox or Chrome, which means that a percentage of Linux users exclude themselves from statistics like StatCounter's by blocking trackers via privacy settings. As stated before by 8350rocks, many schools, libraries, and other organizations use Linux, and those schools, libraries, and organizations block analytics and tracking services to prevent collection of data about minors. Finally, Statcounter only covers 2-3% of worldwide traffic.

Windows-oriented sites live off Windows being mainstream. As many analysts have noted, if the Linux share is as low as some pretend, why is Microsoft so worried about it? Why did they try so hard to block dual-booting on Windows 8 machines? The same question applies to Intel vs AMD: if Intel chips are as good as some pretend, why does Intel spend millions developing benchmarks that favour them? Why are so many review sites biased? I have just discovered an AMD APU vs Haswell review where the reviewers ran the AMD chip with memory underclocked below stock speed (even though they had a 2133 kit at hand) and used the latest driver for Intel but an old beta driver for AMD... If Haswell were that good, they would not need such tricks to inflate its numbers.
 


W3counter measures differently from Statcounter (it counts unique visitors rather than page views), so ALL their numbers are different - their browser stats are also totally different. Naturally you'll choose to believe them over Statcounter because you prefer what W3counter is saying. Even so, 2% is a lot less than the 5-10% you claim.

The paid services Statcounter offer are specialised reports - on the analytics (stat-generating) side, it's free. So that's a weak argument. As for over-emphasising North America, Statcounter don't apply weighting to their measures like other stats services do (if you bothered to read their FAQ you'd know this). So that's an invalid argument. Internet Explorer also offers Do Not Track (and alternative approaches have been available for a long time for users that care enough to make the effort).

2-3% sounds small, but why would you need more than 3,000,000 sites and billions of hits counted? Are you seriously arguing that's an insufficient sample size and that 5,000,000 or 20,000,000 would paint a totally different picture? Are you saying these 3,000,000 sites are all Windows-oriented sites?
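The sample-size point above can be put in rough statistical terms. A minimal sketch, assuming hits behave like independent samples (they don't entirely, so treat the result as a lower bound on the real uncertainty) and using an illustrative ~1% Linux share and one billion hits:

```python
import math

# Standard error of an estimated share from n observations:
# SE = sqrt(p * (1 - p) / n)
def standard_error(p, n):
    return math.sqrt(p * (1 - p) / n)

# A ~1% share estimated from a billion hits (illustrative figures):
se = standard_error(0.01, 1_000_000_000)
print(f"{se:.8f}")  # on the order of millionths - vanishingly small
```

In other words, with billions of hits, pure sampling error is negligible; any real bias would have to come from *which* sites participate, not from the sample being too small.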

Last year, Microsoft had a revenue of almost $73 billion. If they lose even 1% of that, that's $730 million. Even for a giant like Microsoft, that's enough money to be worth chasing and enough reason to 'worry' about Linux.
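The back-of-the-envelope math above works out like this (the $73 billion figure is the one quoted in the post; this is purely illustrative):

```python
# What a 1% revenue loss would mean for Microsoft, using the figure above.
annual_revenue = 73_000_000_000  # ~$73 billion
lost_share = 0.01                # 1% of revenue

loss = annual_revenue * lost_share
print(f"${loss:,.0f}")  # $730,000,000
```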
 
Wow...10% difference between the $1000 i7-3960x and the $180 FX8350. Man, that might be the best $820 I ever saved myself in my lifetime.

:)

Once again, a gaming benchmark where ALL RESULTS fall within a 10% margin of error. Surprise, surprise...
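The price/performance point a couple of posts up can be sketched with simple arithmetic (prices and the ~10% performance gap are the figures quoted in the thread, not measurements of mine):

```python
# Back-of-the-envelope price vs performance, using the thread's own numbers.
chips = {
    "i7-3960X": {"price": 1000, "relative_perf": 1.10},  # ~10% faster
    "FX-8350":  {"price": 180,  "relative_perf": 1.00},  # baseline
}

for name, c in chips.items():
    dollars_per_perf = c["price"] / c["relative_perf"]
    print(f"{name}: ${dollars_per_perf:.0f} per unit of performance")

savings = chips["i7-3960X"]["price"] - chips["FX-8350"]["price"]
print(f"Savings: ${savings}")  # Savings: $820
```

Even granting Intel the full 10%, the FX-8350 delivers each unit of performance at a fraction of the cost.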
 


Therefore the FX-8350 is only 4 FPS behind the i5 and the i7, even though in that review the Intel chips were given minor advantages such as faster RAM and Windows 7 without the patches that correct the scheduler for FX chips.

Moreover, they used a beta driver for the Radeon in the comparison with Nvidia GPUs.
 


That is like saying that you'll naturally choose to believe Statcounter over W3counter because you prefer what Statcounter is saying.

I think I already explained to you that the 5-10% figure is not estimated from web stats.



Weighting was not even mentioned in my post, because it is not what was being emphasized here.



IE may offer that, but FF users are more aware of this kind of privacy setting than IE users.



Sorry, but 1% is 1% here and on Venus. Nobody pays much attention to a 1% share of a market; they put all their resources (management, strategy, research) into the other 99%.
 
I strongly doubt that. Microsoft 'FUD' started back in the 90s when Linux was in its infancy. Does that mean it had significant market share even then, enough to cause Microsoft to worry? You're boring me now anyway - there's no point in arguing with somebody who refuses to employ common sense.

All you have to do is look around at the world around you to see what people are using. Most people I know are barely even aware of what Linux is. Naturally you'll now tell me that most of your friends run it. I could then argue about you socialising with people with common interests, but it'll go in circles.

So go ahead and continue to pretend there's this mass exodus from Windows to Linux. I'll continue to live in the real world.
 


There certainly is no "mass exodus" from Windows to Linux, but it definitely is becoming more widely used and is also expanding into casual use. The average PC user doesn't know Linux, of course; they are heavily marketed to by Apple and Microsoft, which casts a shadow over Linux and its uses.
 
Well, given that AMD has taken over the next-gen consoles, doesn't it seem more reasonable to depend on AMD for future games? It is true that four cores are overkill for most current games, but for a person building a new gaming PC these days (like myself), AMD sounds more promising. What do you think?

I've seen the Haswell i5 CPUs, and I liked them, but I have no idea what AMD has in stock to respond with. Currently I wish AMD would release a better FX-8350-like CPU; I mean, even using only a few cores, what is this 4.0 GHz base frequency doing only barely keeping up with the i5-3570 at 3.4 GHz or so...

any advice??
 


I may partially agree with you on that, but I think it is unfair to compare a console to a PC; consoles are built for one purpose (plus the unified memory architecture and so on). OTOH, an Intel-based console would be much more expensive, right?

My idea is, and I may be wrong, that developers will eventually try to unify their game development between PCs and consoles. They may not, but if they do then they WILL be limited by the consoles' raw performance. And this is where the "more promising" came from.
I'm not an expert or anything, and surely many will correct me, as I hope, but I don't know how developers will handle the next-gen games, and surely no gamer will pay more money for an unnoticeable performance gain.

I'm very concerned about the future, since I'm building my first gaming rig.
 


It is one thing that Microsoft was watching Linux's evolution in the past; the recent worries and direct attacks (such as their attempt to block dual installs on W8 machines) are another. As several experts have noted, Microsoft would not be worried if Linux had only a 1% share. But they are, especially after the W8 fiasco.

Your socialising argument is plain wrong. I find Linux in schools where I know nobody. I find Linux installed in cybercafes where I know nobody. Machines with Linux pre-installed are sold at big supermarkets...

Finally, I did not say that there exists "a mass exodus from Windows to Linux", this was said by someone in that imaginary world you mention.




Most current games use 2-4 threads and thus leave 50-75% of an eight-core chip idle. Next-gen games will be optimized for 6 or more threads because the consoles have eight cores. Quad-cores will be able to play next-gen games for some time, maybe a year or so, but will then become definitively outdated.
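The utilization numbers above are just this arithmetic (the 2-4 thread range is the claim from the post; the percentages follow from an eight-core chip):

```python
# Fraction of an eight-core CPU used vs left idle at a given thread count.
CORES = 8

for threads in (2, 4):
    used = threads / CORES
    idle = 1 - used
    print(f"{threads} threads: {used:.0%} used, {idle:.0%} idle")
# 2 threads: 25% used, 75% idle
# 4 threads: 50% used, 50% idle
```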

All triple-A game developers recommend the FX-8350 chip as the best CPU for future gaming.



PS4 CPU: 102.4 GFLOPS
i7-3770K CPU: 112 GFLOPS
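Those figures follow from the usual peak-throughput formula. A sketch below reproduces them; the core counts, clocks, and the 8-FLOPs-per-cycle factor are my assumptions chosen to match the quoted numbers, not spec-sheet citations:

```python
# Theoretical peak single-precision throughput:
# GFLOPS = cores x clock (GHz) x FLOPs per cycle per core
def peak_gflops(cores, ghz, flops_per_cycle=8):
    return cores * ghz * flops_per_cycle

print(peak_gflops(8, 1.6))  # PS4 (8 Jaguar cores @ ~1.6 GHz): 102.4
print(peak_gflops(4, 3.5))  # i7-3770K (4 cores @ 3.5 GHz):    112.0
```

Note this is theoretical peak, not real game performance - memory bandwidth and how well the code vectorizes matter at least as much.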
 


:lol: Most benchmarks, when showing the setup, reveal that the testers ran the memory at 1600 MHz instead of the 1866 MHz it should run at on default settings. I don't know where you got your benchmarks from, but even Tom's did this. FYI, it's cheating in Intel's favour, because Intel has a better memory management system than AMD; AMD set 1866 as the default for good reason. With it, AMD can keep pace with Intel fairly easily.
 


Actually my HD 7870 XT (Tahiti) does just fine at the moment; however, if Volcanic Islands comes out this year with an HD 9870 or something similar... I might just have to upgrade, simply because the price point would be attractive and the technology should be dramatically more potent.
 


AMD is going to be a much more effective gaming rig in the near future.
 


For APUs, faster RAM makes sense if you are using the IGP. For the FX series, faster RAM shows no performance difference beyond the margin of error (2%). Either way, to benchmark properly, RAM needs to be tested at the same speed on both platforms. Most people buy 1600 MHz RAM anyway.
 




The 2% is untrue.

For benchmarking different RAM modules on the same chip, one uses the same speed; but for benchmarking memory bandwidth on different chips with different architectures, one uses each chip's stock memory speed.

I know lots of FX-8350 owners who use 1866 or 2133 RAM, especially because they overclock. Moreover, if popularity is a concern, why are sites reviewing the i7 Extreme chips or cards like the GTX Titan? According to Steam, most people have HD 3000 graphics; would we then test only that and similar-performance cards? No?
 


You fail to get the point. Moreover, in my country and in others, the difference between 1600 and 1866 RAM is about $7, and 2133 is only a bit more expensive.
 


There is a great deal of truth in this; even now in the States, there's only about a $10-15 difference between same-series 1600 MHz and 1866 MHz RAM. I wouldn't have bought the 1866 RAM I bought if it weren't such a minimal difference.
 
I'm not sure why people keep arguing that "future" games will support more cores. Future-proofing is moot in the tech world. Maybe immediate future-proofing.

I don't think most games will use many cores efficiently until the middle of the 8th-gen console era. Right now, single-threaded performance is key. By the time more games support more cores, your current CPU will already be obsolete.

Especially considering architectures change so rapidly.
 


+1
 