Dantte :
Go to https://www.av-test.org for comparisons of software; it is your best resource for any and all info related to AVs and malware.
As far as security goes, MS Defender got 98.8%, while Kaspersky has ALWAYS been 100%. That is close enough to set aside the question of "whose product is more secure..."
BUT, when you compare the system resources each piece of software uses, MS Defender is a hog: a PC takes a massive performance hit of up to 50% in some instances, with most other categories in double digits (37% and 29%). Kaspersky, by contrast, maxes out at 23% in the same category where Defender hit 50%, with most of the others in single digits (2% and 8% respectively). That alone should be alarming enough to disable MS Defender and get a third-party solution; how can an embedded program tax a system at this level while at the same time providing LESS security?
Kaspersky has not "always been 100%".
Also, trusting a single source such as av-test.org will lead users to sacrifice security. As several security researchers have documented, these "all-in-one" solutions tend to create more security holes than they close.
I always liked this statement from av-comparatives...
We would like to point out that while some products may sometimes be able to reach 100% protection rates in a test, it does not mean that these products will always protect against all threats on the web. It just means that they were able to block 100% of the widespread malicious samples used in a test.
The tests done by av-test.org are relatively easy to score well on. They're typical file-scan tests performed using "malware" samples, and the sample sizes are small enough to be meaningless. They typically use a few hundred to a few thousand samples, while there are usually a few hundred thousand active pieces of malware "in the wild" at any one time. In the case of the January and February tests, 11,524 samples were used. So, they managed 100% detection of around 10% or so of the active malware "in the wild"... Their joke of a "real-world" test only used 195 samples.
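To put that coverage claim in perspective, here is a rough back-of-the-envelope sketch. Only the 11,524 sample count comes from the test itself; the "in the wild" totals below are assumed figures for illustration, not AV-TEST data:

# Rough sample-coverage estimate.
# samples_tested comes from the January/February tests; the in-the-wild
# totals are assumptions chosen only to bracket "a few hundred thousand".
samples_tested = 11_524
for in_the_wild in (100_000, 200_000, 300_000):
    coverage = samples_tested / in_the_wild * 100
    print(f"assuming {in_the_wild:,} active samples -> ~{coverage:.1f}% coverage")

Under the lowest assumption (about 100,000 active samples) you land near the 10% figure above; with larger estimates the coverage shrinks further.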
Just because an antivirus solution scores better than another in a test such as the ones av-test.org performs, that does not mean the software will detect or prevent malware infections better than competing products in real-world usage. AV-Comparatives has shown cases in the past where products performed incredibly well in file detection but failed in "real-world" scenarios.