Intel Releases Meltdown, Spectre Patch Benchmarks

Interesting that the SysMark tests show less overall impact on the Win 7 system than on the Win 10 system. I'm guessing those differences are attributable to Intel using an NVMe SSD on the Win 10 system and a SATA SSD on the Win 7 system, much like how the system with the HDD shows the least negative impact. Any thoughts on that?
 
BIOS and software are not patched, and we all know that the minute you start multi-tasking, a lot of the performance goes down the toilet. Some are reporting more than 30% performance loss in server or business enterprise software.

It's a damn mess, and Microsoft proved it themselves by forcing a buggy patch on old AMD CPUs. It is far from over, and it will only get worse as more patches are needed.

Shame on Microsoft for not creating a separate Meltdown patch and Spectre patch instead of a single patch. AMD doesn't need the first one, but they force users to swallow a botched job.
 
IMHO, how it impacts users depends on what the computer is used for. If you are using the computer for gaming, YouTube, Facebook, and surfing the internet, I don't think it really matters.

Although it does show a 14% impact for office applications, most of these users are just using Word, Excel, PowerPoint, Outlook... again, insignificant (the CPU is idle most of the time).

Hence, I would say it will affect enterprises and data centers rather than end users. I feel that even if end users don't patch, it doesn't really matter. Software issues are a far bigger concern than this. We have already seen so many exploits and company servers being hacked to reveal customer data and even credit card data, and none of it had anything to do with Spectre and Meltdown.
 

So, that's why it's good we now have data.


I think the point of these benchmarks is to show the user-perceivable impact. So, a well-designed "Responsiveness" test should serve as an indication of how responsive the system feels. Unless you have some information about why the benchmark fails to reflect that, I think it should be taken at face value.

As for the other tests, they're designed to reflect various other situations where the computer's performance is noticeable by users. While it's true that not everyone needs a high-end rig, it's really up to the user to decide whether a benchmark is relevant to them (i.e. "do I sit around, waiting for the computer to do media creation, productivity, etc.?").


Personally, I almost never have to wait for my productivity apps, but then I'm a pretty light-weight user of such programs. The two most relevant operations for me are web browsing (and yes, certain page loads easily spike my CPU - it's not 1995 any more) and software builds. As incremental builds are highly I/O bound, I think the impact will definitely be significant for them.
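
To illustrate why builds in particular are sensitive: an incremental build is mostly a long series of stat() calls checking whether sources have changed, and every one of those calls is a kernel round trip that now pays the page-table-isolation toll. Here's a rough sketch of that pattern (my own illustration, not from the article; Linux-only, and the file path and iteration count are arbitrary):

```c
/* Rough sketch (not from the thread): time a burst of stat() calls, the kind
 * of metadata checking an incremental build does thousands of times.  Each
 * call is a kernel entry/exit, which is exactly where the KPTI/Meltdown
 * mitigation adds its fixed per-syscall overhead. */
#define _GNU_SOURCE
#include <stdio.h>
#include <time.h>
#include <sys/stat.h>

int main(int argc, char **argv) {
    const char *path = argc > 1 ? argv[1] : "/etc/hostname";  /* any existing file */
    const long N = 1000000;
    struct stat st;
    struct timespec a, b;

    clock_gettime(CLOCK_MONOTONIC, &a);
    for (long i = 0; i < N; i++)
        stat(path, &st);                     /* one syscall per iteration */
    clock_gettime(CLOCK_MONOTONIC, &b);

    double secs = (b.tv_sec - a.tv_sec) + (b.tv_nsec - a.tv_nsec) / 1e9;
    printf("%ld stat() calls: %.2f s (%.0f ns per call)\n", N, secs, secs / N * 1e9);
    return 0;
}
```

Run it before and after enabling the mitigations and compare the per-call cost; the absolute number depends on the machine, only the relative change matters.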


That's an interesting philosophy on security. I doubt anyone who's ever been infected with malware would agree that known vulnerabilities should go unpatched. Effective security depends on a layered approach, and without patching these vulnerabilities, it might be possible for someone to combine them with a vulnerability in a web browser to gain admin privileges. That's a big deal.

Sorry to be tough, but you're new and your points read like they're straight from a script out of Intel's PR. I agree that the most serious performance issue is going to be faced by cloud providers & appliance vendors that suddenly need to add more capacity, but I think your views on security are out of step with the times.
 
I am not willing to sacrifice the kind of performance noted for my Windows 7 laptop running a Sandy Bridge i7 CPU. That is stupid, especially given that there really is NO threat. Now that so many systems are going to be updated, there is little reason for any scumbags to try to exploit these vulnerabilities, IMO. From my perspective, the cure is far worse than the disease, especially on older hardware / OS combinations. It just is not worth it. So, I believe Microsoft should make a way to have these patches be OPTIONAL and AVOIDABLE and UNINSTALLABLE. This is crap!
 
I installed all patches for my z620 running W7 and two E5-2680 v2s and have seen no differences in Cinebench R15. That is assuming I got the right ones, and I still have to wait and see if HP releases a BIOS patch that goobers things up. Even though I saved a lot of money going this route over Threadripper, I am starting to kick myself: even though my computer does everything I want it to and more, it might all be for naught in the near future.

If push comes to shove then I would rather take my system offline and build a super cheap Ryzen web-browser.
 
Even if the performance hit were only two percent it's still unacceptable!

People bought the higher-end CPUs to get performance out of them. Who buys a 6700K or 8700K when a 6400 / 8350 will suffice for most games?

Then to have this come along and suck the very life out of our CPUs, pushing us back essentially two generations or down several models of CPU. The cost of that across the board has got to be nothing short of phenomenal!

And the message seems to be clear.. Upgrade to Windows 10, as Windows <10 are being hit harder. I'm surprised we haven't seen the tin foil hat group start saying this is another Microsoft/Intel collusion to force sales of Intel upgrades and Windows 10 (Though I wonder how the exploits here are going to help Intel's sales at all).

I just spent more than $3000 upgrading my PC to an i9-7900 (~$1900 AUD counting delidding), and $750 for a top-end motherboard, plus an upgrade of my 950 Pro 512 to a 960 Pro 1TB and a new 280mm cooler with Noctua NF-A14 PWM iPPC fans to keep it all cool. It's clear what I'm after: every ounce of performance I can scrape out of the system.

And correct me if I'm wrong, but the patches are part of the cumulative updates. So the only way to avoid them (but why would you want to?) is essentially to never install another patch on Windows...

God, what a mess 🙁
 

Cinebench shouldn't be much affected, as it's mostly doing userspace computation rather than making lots of calls into the kernel (i.e. as would be used for I/O-heavy workloads).
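
To make that distinction concrete, here's a minimal sketch (my own illustration, not from the article; POSIX/Linux-only): the first loop never leaves userspace, so page-table isolation barely touches it, while the second loop crosses into the kernel on every iteration, which is exactly where the mitigation adds its per-entry cost.

```c
/* Hypothetical microbenchmark: a pure userspace compute loop (Cinebench-like)
 * versus a syscall-heavy loop.  Only the second should slow down noticeably
 * once the Meltdown mitigation (KPTI) is enabled. */
#define _GNU_SOURCE
#include <stdio.h>
#include <time.h>
#include <unistd.h>
#include <sys/syscall.h>

static double now(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void) {
    const long N = 10000000;            /* 10 million iterations of each loop */
    volatile double x = 1.0;

    double t0 = now();
    for (long i = 0; i < N; i++)        /* stays entirely in userspace */
        x = x * 1.0000001 + 0.5;
    double t1 = now();

    for (long i = 0; i < N; i++)        /* one kernel entry per iteration */
        syscall(SYS_getpid);
    double t2 = now();

    printf("compute loop: %.3f s\n", t1 - t0);
    printf("syscall loop: %.3f s\n", t2 - t1);
    return 0;
}
```

Compile with something like `gcc -O2 bench.c -o bench` and compare the two timings with the mitigations on and off; the compute loop should stay put while the syscall loop gets slower.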


Why would it be for naught? If you mainly wanted to do gaming and/or 3D rendering, the impact should be minimal.
 

Really? Malware often chains together multiple exploits, in order to get its claws into your system. It's not necessary to have just a single vulnerability that directly lets someone gain admin privileges from a malicious web page in order for that to happen.


Lots of corporate systems don't pull updates directly from MS, and there are pirated copies of Win7 that can't get updates. Malware often attempts a variety of techniques, rather than depending on a single vulnerability. And once code to exploit these vulnerabilities is out in the wild, it's easily included in the rootkits and libraries on which much malware is based.
 


If the end result is minimal then I shall be a happy boy, but I keep seeing hearsay from one side or the other about how big a percentage hit it will have on this or that particular task.

The developers of my rendering software (Redshift) are legitimately worried that this could impact performance during the initialisation phase of rendering, when scene assets are loaded onto the GPU, and when the GPU goes out-of-core, whereby all available VRAM is used up and the software switches to system memory instead while streaming to and from the HDD/SSD.

Like I said, if nothing/little happens then I will be perfectly content; otherwise I will take my rig offline and use a dedicated internet box.

P.S. In case you are wondering why I chose CPUs with so many cores when I will mostly be rendering on my GPUs, it is because I use Keyshot to get some better-looking WIPs out of ZBrush while also doing simulations in Houdini. I also troubleshoot the occasional Arnold glitch for a friend.

EDIT: I am really sick and this slipped my mind: the RS developers are looking into it, and the prognosis is good for general use, but the OOC situation is a mixed bag, so I am going to wait and see.
 
As I read on other popular tech sites, one thing they say is certain: whether it's an 8 percent or a 24 percent performance slowdown, if you own a high-end gaming rig, a high-end workstation that you use for content creation, or a high-end laptop that you use for multimedia editing, office, and web applications, the performance issues caused by these bugs will make your computing experience somewhat disappointing, knowing that every time you turn on your computer it doesn't deliver the performance you expect and takes away the sense of why you bought a high-end system in the first place.

IMO, they are kinda right and I agree with them. It kinda sucks, really.
 

Just imagine how much faster your next system will seem!
 