AMD or Intel?



Please tell me your budget, as that would end this war once and for all.

There is a reason Intel is in 90% of high-end workstations, servers, and much more.
 


He's probably trying to save some money; in that case it's better if he just goes for the FX-6300. Plus, with a GT 610 the 8320 will be bottlenecked without a doubt.

 


No swearing necessary.

The highest load temps I see on stock clocks are 30°C; idle is around 15°C.

It all depends on the cooler; a Hyper 212 Evo will do grand. I use an NZXT Kraken C60.

Nvidia is out of the question for the GPU, and for animation on a budget a FirePro is perfect, though any low-cost GPU will do; rendering will just be faster with a FirePro.

You can easily overclock the 8320 without altering the voltages. I found that around 1.5 V it begins to get hot, 40°C to 60°C; it depends a lot on the environment as well.

There is no need to spread the anger of having bought an Intel when the equivalent is cheaper from AMD. Have you considered an APU? Those mentioning the Intel HD IGP should, to be fair, also consider the possibility of an APU. I have not used an APU, but I hear they are superior in many ways. Honestly, how many Intel HD discrete cards have you seen paired with the Intel HD IGP for a hybrid setup? What does Intel even call it? It can't be SLI or CrossFireX, because those terms are trademarked by Nvidia and AMD. For future expansion and overall performance I would have to hand it to AMD: with an APU you have the option of adding a discrete GPU of the same family to your build for Hybrid CrossFireX, significantly bumping up your performance.

The fanboys of Intel will delude you with their hype and haste. As a user of both manufacturers, though, I have seen the performance difference: yes, Intel excels, but it degrades fast and ends up taking more to maintain later on. I will admit Intel was once on top of their game, but their recent corner-cutting with chip quality leaves me wondering what they have going on behind closed doors, and then there are their outrageous prices for second-hand quality...
I make my point with their new chips: on CPUs produced in 2013, the thermal bond between the heat spreader and the die was NOT a thermal epoxy but a silicone-based grease, the kind you can buy from RadioShack for less than five bucks. That is a severe quality drop. Temps may appear cooler at first, but let it dry out and it will fry; it takes about two years for silicone-based thermal grease to dry out. Those of you who purchased their Dorito last year, it won't last to 2016, whereas my AMD will power on through. Granted, everyone will have 20-core CPUs by then, and AMD will be on top for inventing it WITHOUT using Hyper-Threading like Intel does.

Intel eight-core? No, just a hyper-threaded quad, nothing special. That's like dressing up a four-banger Civic engine with bolt-on V8-looking panels.

AMD eight-core? Oh yeah, that is actually eight cores, four pairs of cores. That is a V8; technically a V8 engine is four V-twins linked together. Looks like AMD took notes on how to make eight cores.

And before you fanboys get your pants in a bunch: yes, I'm sure Intel will make an actual 8-core CPU some day, but they will not sell it as an 8-core; they will push it out as a 16-core CPU... BTW, AMD has a 12-core already.

makes me wonder how many actual cores are in the Intel HD IGP...

So Rahul, your CPU is the FX-8320.
That GT 610 is a weak little card; you can do better.
 


The EVGA GeForce GT 610 has just 48 cores.

This similarly priced Radeon HD 6570 has 480 cores.

The GT 610 will be the bottleneck, as the CPU would be waiting on the GT 610. Similar issues may arise with the HD 6570, but it has 10 times the cores. I was recently curious why Nvidia calls them CUDA cores and AMD calls them stream processors; to my surprise they are basically the same kind of pipeline, just different names. Like gasoline from Shell or Chevron: it's the same basic fuel, just different prices.
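Since the argument here is raw shader counts, here is a rough back-of-the-envelope sketch in Python. The shader counts are from the posts above; the clock speeds are assumed typical reference-spec values (board-partner cards like EVGA's may clock differently), so treat the results as ballpark only, not benchmarks.

# Back-of-the-envelope peak shader throughput comparison.
# Shader counts come from this thread; the clocks are assumed
# reference-spec values, not measured on any particular card.

def peak_gflops(shaders, shader_clock_mhz, flops_per_clock=2):
    # Rough peak single-precision GFLOPS: shaders * clock * FLOPs per clock.
    return shaders * shader_clock_mhz * flops_per_clock / 1000.0

cards = {
    "GeForce GT 610  (48 shaders, ~1620 MHz shader clock, assumed)": (48, 1620),
    "Radeon HD 6570 (480 shaders, ~650 MHz core clock, assumed)":   (480, 650),
}

for name, (shaders, clock) in cards.items():
    print(f"{name}: ~{peak_gflops(shaders, clock):.0f} GFLOPS peak")

Real workloads won't hit those peaks, and memory bandwidth and drivers matter too, but the gap is wide enough that the GT 610 is clearly the weak link in that pairing.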
 


You need more than simple words to make your statement valid.

Relevant benchmarks or case studies are necessary. Just because you (or I) say so, that does not make it true.
 


But you're a Moderator, so what you say is law, lol.
 

Intel doesn't make discrete GPUs, therefore you cannot hybrid it; it's not because of trademarks.


You can only hybrid it with a low-end GPU, which is already outdated.
A CPU + dedicated GPU will be better.


Now here is something I don't understand. Intel's 2nd generation is still considered high-end. Intel makes products that are built to hold up for a long time, and they do hold up for a long time.
Which is another reason Intel is dominating the server CPU market.
Are you saying AMD is in front currently? That must be a joke.
The reason Intel products cost much more is that they manufacture them themselves. AMD buys its manufacturing (Nvidia too).
Intel has been working on new things; we will soon see 14 nm CPUs, whereas AMD has just launched 28 nm CPUs.


Intel hasn't been focusing on overclocking for the last couple of generations.
Ivy Bridge and Haswell have been rather bad for overclocking. They have been focusing on something else, because AMD failed in the CPU market.
Ehm, Intel's CPUs will still stand strong in 2016.
Didn't AMD just send out a line with only 4 cores?
Haswell-E is going to be 8c/16t.
Also, hyper-threading is not a virtual core; it lets 2 threads run on 1 core.
Also, Intel already made a 36c/72t server CPU, if I remember correctly.



Hyper-threading isn't anything special? That might be the first time I've heard that.
Intel has never sold a quad core with hyper-threading as an 8-core. They still call it a quad core.

 

This actually screams how little you know about APUs and Hybrid CrossFireX. Yes, there was an older version of Hybrid, but that was for a GPU integrated on the motherboard, and it was a Radeon HD 4500-series part; ASUS called this GPU Nos. The new APUs actually let you build a budget-friendly Hybrid CrossFireX with 7000-series GPUs, with AMD bridging the GPUs through the motherboard.


Oh, look at that, you confirmed what I just said. The only difference is that Intel recently caught their oopsie and no longer advertise them as "16-core" CPUs; instead of falsely claiming the threads to be cores, they now state how many cores there are and that it pretends to be a 16-core with its hyper-threading. Two years ago I said to my friend, "If Intel can stop their games and make a real 8-core CPU, can you imagine how awesome that'd be?" Well, they did, and yet still managed to botch it with Hyper-Threading. What a joke, lol.


A thread is a process, right? OK, and in the old days we called them processors because they processed the thread, a process of streaming data. Now when we take two threads and make one core do the work, we are emulating 2 cores over that one core; even Windows thinks it's two cores. Who are you fooling with this? Not me, I see the lies clear as day. You sell me a quad core, I want a quad core, not a hyper-threaded dual core!!
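For what it's worth, you can check the logical-vs-physical split on your own machine. Here is a minimal sketch, assuming Python with the third-party psutil package installed (pip install psutil):

# Compare logical processors (hardware threads) with physical cores.
import os
import psutil  # third-party package; provides the physical-core count

logical = os.cpu_count()                    # what the OS actually schedules on
physical = psutil.cpu_count(logical=False)  # actual physical cores

print("Logical processors (threads):", logical)
print("Physical cores:              ", physical)

if logical and physical and logical > physical:
    print("SMT/Hyper-Threading is exposing more threads than cores.")
else:
    print("One thread per core (no SMT, or it is disabled).")

On a hyper-threaded quad this reports 8 logical processors but 4 physical cores, which is really the point both sides are circling: the OS schedules on threads, but the silicon only has so many cores.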


ACTUALLY no, Intel plans a 48-core parallel-processing smartphone CPU at an overall clock of 1.66 GHz to 1.8 GHz, hardly worth noting since it's just a redesigned GPU, like that of the GT 610, stuck in a phone.

I bet next you're gonna tell me about the Intel paperweight of 128 cores? The one they developed in China to do cloud processing but scrapped because they would have charged $20,000 per 3.3-inch-square chunk of bugtacular waste?

While we're on the subject of CPUs, how come none of you are frothing at the mouth for an IBM chip? Their latest experiments with their crystal CPUs have put them at 7 GHz without a CPU heatsink or fan.

Oh, and I said all that, NOT Metalrenok.
 

Intel never sold their product as 8 cores; that could have been from resellers.
What is so bad about hyper-threading? I cannot see it.



Intel cores don't emulate 2 cores; how hard can it be to understand?
They let 2 threads run on 1 core. With the scheduling logic, if it sees a core running integer instructions it will slot in SIMD instructions alongside them (for example).
The only possible way an 8320/50 will win over a 4770K in multitasking is if it's pure integer instructions, which will never happen. Windows cannot see cores; it can only see threads.



http://www.techoftomorrow.com/2013/pc/intel-unveils-72-core-x86-knights-landing-cpu-for-exascale-supercomputing/

 
Just to clear it all up for the OP, as we went really off topic.

The 8320 will be better for you simply because Intel is out of your budget.
The AMD FX-8xxx series is considered the cheap high-end.
FX is AMD's gaming series of CPUs, and they don't handle general heavy work as well as Intel (not that they don't do it fine, because they do).

If AMD were better than Intel, how come Intel is dominating the market, and in every high-end workstation you will see an Intel CPU? And AMD won't even continue with their FX series (yet); they would rather follow Intel's path (CPU with IGP and fewer cores but a better architecture).

To be clear, I'm not a fanboy of Intel; I'm a fanboy of power.
I have had an AMD CPU and an Intel CPU,
an AMD GPU and an Nvidia GPU. I have simply picked what was better at the time.


 
This dbag vmN is unselecting all the best answers. Get over it; you're a fanboy and it shows. I use Intel in my current rig and I love it, but the time comes when you have to give an honest opinion, and mine is that for the work the guy needs to do, AMD will perform better.
 


I don't believe something wrong should be the selected answer; might as well explain it right.
The answers given didn't provide true facts and gave the wrong image of both manufacturers.