Intel Pentium G3258 CPU Review: Haswell, Unlocked, For $75

Status
Not open for further replies.

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


When your video card is the limiter they all look the same, or pretty close to it. I prefer seeing at least two settings for each game: one showing what normal people run at (largely 1080p), and another at much lower settings to stress the CPUs and show how they'd separate if you do one day buy a card that makes the CPU the bottleneck instead of the GPU. That way you get to see how your next GPU purchase will affect the CPUs they're testing. A Maxwell purchase would likely make the exact settings they used here more CPU-limited, as it SHOULD be much more powerful at 20nm (AMD too). If all the CPU benchmarks look the same because you keep tapping out the GPU in the tests, nobody knows what happens when the shoe is on the other foot, which will surely happen at 14nm if not 20nm.

Of course, if time constraints force them to reduce tests, you can go elsewhere and find that data, as you apparently already did. Someone always tests the other side of the story, whether it's here or another site.
 

Rakeen70210

Reputable
Jun 9, 2014
73
0
4,660
What I would like to know is how much you can overclock this chip with the stock fan on the $40 ASRock motherboard. I mean, if I want to go as cheap as possible but still get results, then I think it's worth it for them to test the stock fan and a cheap mobo, no? If at least 4GHz can be reached that way, I would love to buy this, but if I'm required to buy a cooler and a better motherboard then there is no point.
 

vertexx

Honorable
Apr 2, 2013
747
1
11,060


Spend the $115 and let us know!
 

logainofhades

Titan
Moderator


I would probably get a Hyper TX3 at the very least for overclocking.
 

justchuck69

Distinguished
Jan 21, 2009
60
0
18,630
So what you are saying here is: for gameplay, why buy a $240 i5 when a $75 Pentium will beat it most of the time? Unless you do a lot of work with your computer, in which case spending the extra might help (but an extra $100 for an i7 would be better)!
 
Ok, this raises an interesting point of comparison. Given the choice of buying a cheap H81 mobo and this Pentium, OR a cheap AM1 board and quad-core Kabini, never mind performance (which the Pentium would obviously win), which would be the more interesting (and perhaps useful; not just for gaming) parts for experiments? Or, if you were going to mess with one or the other, which would you choose, and why?

Hmmm, I may start a thread...

Edit: http://www.tomshardware.com/forum/id-2186291/mess.html
 

Rakeen70210

Reputable
Jun 9, 2014
73
0
4,660
Yeah, see, if I wanted to get a cooler, a minimum of $20 would need to be spent. I could get better RAM, a better PSU, a better GPU, a keyboard, anything else with that money. Hell, even a better CPU would be possible (the FX-6300 is $99 at Microcenter WITH A FREE MOBO). Well, I guess we just have to wait a few more days until everyone starts mass-benchmarking the stuffing outta this Pentium to find out what it can really do.
 

logainofhades

Titan
Moderator
I am considering the Pentium and H81M-DGS R2.0 for cheap WoW rigs. Might do better than my FX 8320's given how Blizzard hates AMD. Not to mention, a lot less heat to deal with for summer. I could always just keep the FX's and swap back for winter. :lol:
 

vertexx

Honorable
Apr 2, 2013
747
1
11,060
A quick power/physics discussion - the last chart "Efficiency" is confusing. "Efficiency" should always be a "more is better" chart. You are actually charting Energy Consumption. Your "Power Consumption" chart should just be labeled "Average Power". You don't consume Power, you consume Energy. Power is simply the rate of consuming energy. So here is how I would rename these charts:

1. "Average Power Consumption" should just be "Average Power"
2. "Completion Time" is fine.
3. "Efficiency" is really "Total Energy Consumption" - in this case, lower is better. Units should be "Watt-hours" or other standard unit of energy.

If you really want an "Efficiency" chart, then you need to normalize it, which is really impossible in this scenario. Efficiency measurements are always unit-less, and you obtain it by dividing some quantity output by quantity input. In this case, the output is the benchmark, which is impossible to quantify in terms of an energy number. So really you can't do "Efficiency" here, unless you wanted to set a baseline standard that you measure all other results against. Hence, my suggestion to change that to "Total Energy Consumption" for the benchmark test, which is a "less is better" measure.
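The power/energy/efficiency distinction above can be sketched numerically. A minimal illustration (the wattage and runtime figures here are made up purely to demonstrate the chart relabeling, not taken from the review):

```python
# Energy vs. power: you consume energy (watt-hours), not power (watts).
# Power is the rate of energy use, so for a fixed-work benchmark run:
#   energy (Wh) = average power (W) * completion time (h)

def energy_wh(avg_power_w, completion_time_min):
    """Total energy consumed over the benchmark run, in watt-hours."""
    return avg_power_w * (completion_time_min / 60.0)

# Two hypothetical systems completing the same fixed workload:
low_power_chip = energy_wh(avg_power_w=50, completion_time_min=30)   # 25.0 Wh
high_power_chip = energy_wh(avg_power_w=120, completion_time_min=20) # 40.0 Wh

# "Efficiency" only works as a relative, unit-less number against a
# chosen baseline -- e.g. the lowest-energy result in the comparison:
results = {"low_power": low_power_chip, "high_power": high_power_chip}
baseline = min(results.values())
relative_efficiency = {name: baseline / e for name, e in results.items()}
# The baseline system scores 1.0; everything else scores below 1.0,
# making it a proper "more is better" chart.
```

Note how the faster (shorter completion time) system can still consume more total energy, which is exactly why "Average Power" and "Total Energy Consumption" need separate charts.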

 

mapesdhs

Distinguished


From what I've seen with AE, I expect it depends on the nature of the work one is doing. The lesser cache
could hurt in some cases, but a more severe issue might be those cases where a particular plugin one is
using can't be accelerated with CUDA. When a friend of mine was trying various things out in AE, for
example, he found that the Shatter plugin really slowed things down, whereas an earlier section of the
animation (lots of raytracing) rendered very quickly with three 580s. Turned out the Shatter plugin code is
very old, doesn't get boosted by CUDA, and only runs on one CPU core.

Other than that, one might feel a degree of sluggishness if the system is having to cope with other tasks at the same time, whether that's I/O loading from the main app being used, or some other task such as a background virus check. Best to run such systems standalone so one doesn't have to install any security stuff. The slower RAM support could be a limiting factor as well.

Thus, using CUDA could, I suppose, make it a reasonable entry/budget system, but I reckon one would quickly tire of the various likely limiting factors. Too many aspects of using a pro app don't benefit from CUDA, e.g. the main interface.

I've not yet tested AE CPU issues with a dual-core, but today I did compare to a stock-speed P55 system with an i7 875K (3.2GHz) and 16GB RAM @ 1333, using four GTX 580s for CUDA. For the simple Blender/BMW and Creativecow/AE CUDA tests, the difference compared to a 3930K @ 4.7 system with 64GB/2133 RAM was minimal, but for a much more intensive AE test (a Titan would do it in about 25 or 30 mins) the difference was a lot more significant, about 14% slower. I'll test with an i3 550 later.

Ian.

PS. Alas I can't comment specifically re Premiere/Photoshop (don't know much about them), but feel
free to PM me to discuss further.

 


Yeah, the thing is, the "free" board is complete junk. Even the +$40 board is complete junk. So no...

http://www.microcenter.com/product/366104/GA-78LMT-S2P_Socket_AM3_760G_mATX_AMD_Motherboard
 

usbgtx550

Distinguished
May 24, 2011
372
0
18,810
I would like a comparison of more AMD processors, or at least a stock 6300. For the price of a Z97 board and heatsink, you can probably get an FX-6300 with a 970-chipset motherboard.
 


I am well aware. But I wouldn't buy a 760G chipset board even if I was paid to.
 

Rakeen70210

Reputable
Jun 9, 2014
73
0
4,660


I realize that the mobo is complete trash and that is why it's free, but the point I was making was that the 6300 can be quite cheap if you have some extra money. The motherboard is just a bonus, which I can use for another extremely cheap rig or keep as a backup. After all, if you have only a $300 budget, every dollar counts, right?
 
I'm not so sure.

I suspect the lack of L3 cache (as worthless as AMD's is) on the Athlon was masking the likely issues stemming from the tiny cache on the Pentium.

When you look at an FX-4300 in benches against a 750K, you'll notice it pretty consistently beats down its slightly gimped cousin. The FX-4300 has been overpriced for a while, and I do expect it to come down to Athlon levels... but there really isn't anywhere for the FX-6300 to go from here... it's already a killer deal at the $105 it typically sells for.

I'd probably take the money saved going AM3+, grab a solid budget overclocking board like the M5A97 R2.0 for $80, and put the extra cash into an actual budget gaming chip like the FX-6300. They're not as good as an i5, but I'd take one over an i3 any day of the week, and they're not that far back from the i5.

This is a cute chip, but I sorta question the market. Unless Intel lets them be overclocked on cheaper motherboards, the total system cost will be about the same as an FX-6300 gaming system... and I'd probably take the FX-6300 over this little guy every time. Not to say it's not a great chip in comparison to the Athlon II, but then I don't think I've ever suggested an FM2/FM2+ system for a gaming build anyway.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


The FX-8350 is $170 at Newegg.
The i3-4330 is $140, has Hyper-Threading, and now comes as the i3-4360 for $160 at Newegg!
http://www.xbitlabs.com/articles/cpu/display/core-i3-4340-4330-4130_7.html
It wins most of the games vs the 8350 (the 8320 would be slower still), and in a SFF or HTPC it uses 83W vs 210W, which can make a ton of difference.
i3-4330 wins:
Batman Arkham, Civ 5, F1 2013, Sleeping Dogs.
It loses Metro/Hitman, but not by much at 1080p, which I consider most important for these guys.
It won Photoshop CC also. Vs the FX-6350, AMD would also lose Hitman to the 4360, since the 4330 is already right on its butt. Metro looks like it would still show the same result, and AMD wins there.

Granted, more cores are better for most apps, but as a gamer it's a pretty clean sweep, with only two losses to the 8350. Also, as noted, the watts can't be overlooked in some situations. These also come with a GPU for people who can't afford to pair one with an 8350, etc. Just like the Maxwell 750, there are some very good use cases for these chips.

 

logainofhades

Titan
Moderator


The ASRock H81M-DGS R2.0 and H97 Pro4 both have overclocking options, and Asus recently announced it was going to be added to some non-Z boards.
 