First of all, there are NO systems that nobody gives a damn about, because nearly all systems either ARE or can be connected to all other systems. So that's a senseless comment. My neighbor's refrigerator can talk to a server in California, which can talk to an entire network of systems, and so on, but you know this and are just being snarky. So EVERY system is a potential liability if it's vulnerable, especially if it's accessing your home network, which, generally, it is.
And as I said before, yes, it's fully understood that some other form of malware or exploit already needs to be in place on the target machine for any of these particular strains to be dangerous. But the problem is that, as we've discovered, there are PLENTY of systems out there, refrigerators, home computers, phones, ovens, whatever, that have already been shown to be particularly vulnerable to exactly the kinds of infections that need to be present for this to happen. So it's really not a stretch to say that this can and likely will happen at some point.
How many people do you know who are going to patch the software on their refrigerator, or update its firmware? Practically none. So if the hardware is vulnerable and you are knowingly selling it that way, that can't be acceptable to anybody with half a brain.
I agree that Intel is "unlikely" to scrap the whole generation, but I also think it's incredibly stupid and short-sighted not to. They're already in hot water, with multiple class action lawsuits filed against them because they released Coffee Lake knowing full well it was vulnerable, and now we're talking about releasing ANOTHER round of processors after the fact that are either vulnerable or hamstrung if patched by way of OS and firmware? Seems incredibly risky from a PR standpoint, if not a legal one.
And as far as the performance issues are concerned, unless you are willing to stick your head in the sand, it's kind of hard to ignore the reality, which is not as you've described. We've seen TONS of data already that refutes what AMD and Intel have been claiming about there being minimal decreases in performance. When architectures barely gain more than ten percent per generation, taking a ten percent hit while gaming, or up to a thirty percent hit for PCIe and SATA SSD performance depending on workload, is pretty damn noticeable in my book.
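For what it's worth, the reason the hit varies so much by workload is that the Meltdown fix (KPTI) adds a page-table switch to every user-to-kernel transition, so syscall- and I/O-heavy workloads pay on every call while compute-bound code barely notices. Here's a rough sketch for measuring per-syscall cost on your own box (Python's interpreter overhead dominates the absolute number, so treat it as illustrative, not a benchmark):

```python
import os
import time

def time_syscalls(n=100_000):
    """Average the cost of a trivial user->kernel round-trip.

    os.getpid() issues a real getpid() syscall on modern glibc; with
    KPTI enabled each of these transitions also pays a page-table
    switch, which is where the workload-dependent slowdown comes from.
    """
    start = time.perf_counter()
    for _ in range(n):
        os.getpid()
    return (time.perf_counter() - start) / n  # seconds per call

if __name__ == "__main__":
    print(f"~{time_syscalls() * 1e9:.0f} ns per getpid() call")
```

Run it on a patched and an unpatched kernel (e.g. booted with `pti=off`) and compare. Database and SSD-heavy software makes millions of these transitions per second; a game's render loop makes comparatively few, which matches the numbers testers are reporting.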
I don't think I need to post links to the data on that, since it's plastered all over this and many other forums where the ACTUAL data from testers is getting dumped. If you want to believe the sugar-coated crap the damage control people are slinging to the front pages and the media, be my guest, but much as with the ostrich whose head is in the sand, you're going to get either run down or eaten whether you can see the danger coming or not.
Maybe you're OK with a ten percent hit in some types of performance and a thirty percent hit in others; for the record, I'm really not.
I feel touched that you care about my grandmother's computer, which she uses solely to write a letter in OpenOffice and mail me the document to print out.
😉
Jokes aside: yes, unpatched systems are an issue. But as others have described, they are an issue anyway, whether or not Meltdown happened. Yes, Meltdown and Spectre are really dangerous. But by the time you can exploit them, the system is already compromised to a degree where Meltdown & Spectre are just the tip of the iceberg.
I do reckon that with servers and cloud services this is a real threat, and it further encourages my scepticism of cloud services that aren't local (meaning not connected to the internet, on a separate network). But that's not what I was talking about. I was talking about consumer-grade CPUs: those i3-9100s, i5s and i7s. Yes, I've read the tests concluding that database work is heavily slowed down by this, and it really sucks. But, and I may stand corrected, though I'm pretty sure tbh, the majority of people are still using their computer for light office tasks, correspondence (whether it's email, messengers or social media), Netflix (and maybe gaming), and if your machine wasn't already struggling with these tasks, I still haven't seen data indicating you'll have to fear a severe performance decrease. I'll happily revise my point of view if there's evidence I'm wrong; so far I haven't seen any -- maybe you know more than I do?
As for being fine with that, I guess there are different breeds of people. I personally don't care if the eSATA port on my mainboard is broken, since I don't use it and it isn't the feature I bought the board for.
Same with my car: I don't care whether it's able to go 190 or 220 km/h; that's well above the speed limits and I won't be able to drive that fast anyway. As long as the acceleration, brakes & comfort are still what I'm expecting, I'm fine with that. I do reckon there are people who have a different opinion on this, but after years in customer support I can say I hate people who hysterically complain about things that don't concern or apply to them.
Also, this issue has been exploitable since... I know all of the Core i CPUs have it, but I think it was the same for CPUs since the early 2000s? Late '90s? And the world is still alive. Don't get me wrong, it is a major concern, but the hysteria going on is a bit unwarranted. There are people worried that they can't upgrade their FX-6300 to an i5-8600K for their gaming rig because of Meltdown, not because of security but because of performance issues. And that's just bloody stupid. I blame that on the hype-hysteria with which this whole thing has been reported.
As for refrigerators and the IoT, I must say that I know too little about it. I don't see a reason to connect my refrigerator to the internet in the first place; I don't see any benefit in doing so. It's one of those "because I can" things. Sure, I could also eat only at McDonald's, but what's the benefit of that?
As for Intel fixing this and skipping lines -- and as a leftist I really hate myself for saying this -- I hope they don't. Addressing this flaw will probably take major resources, and that costs a lot of money.
The last news I've heard of Cannon Lake is that the 10nm process (do you say "process"? English isn't my native language) is giving them major headaches; the CPU is otherwise finished in terms of features, design and architecture.
Scrapping this line completely would be foolish, a major loss, with three possible outcomes:
- Intel goes out of business (which would be pretty bad considering their know-how and the potential lack of competition arising. Without Ryzen, I'm certain we wouldn't have seen mainstream Intel hexa-cores.)
- Intel becoming vulnerable to hostile takeovers (by companies who may not share the same goals and are more interested in its patents than in its consumer/server CPUs)
- Intel having to downsize and therefore not being able to dedicate as many resources to fixing this as we'd like, resulting in insufficient R&D: a badly designed new architecture with flaws of its own, major bugs, or a general halt in CPU performance increases
So from a totally egotistic point of view I hope Intel sells a shitload of CPUs while losing some stock market value, just enough so they start to work thoroughly again.
And, of course, people patching their systems. (While I personally find Win10's forced updates annoying, I do understand why they were implemented like that.)
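Speaking of patching: on Linux (kernel 4.15 and later) the kernel itself reports whether your CPU is affected and which mitigations are active, under `/sys/devices/system/cpu/vulnerabilities`. A small sketch for reading it (availability depends on your kernel version):

```python
import glob
import os

def read_cpu_vulnerabilities(base="/sys/devices/system/cpu/vulnerabilities"):
    """Map each reported vulnerability (meltdown, spectre_v1, ...) to the
    kernel's mitigation status string. Returns an empty dict on kernels
    that predate this sysfs interface."""
    status = {}
    for path in glob.glob(os.path.join(base, "*")):
        with open(path) as f:
            status[os.path.basename(path)] = f.read().strip()
    return status

if __name__ == "__main__":
    vulns = read_cpu_vulnerabilities()
    if not vulns:
        print("No vulnerability info exposed (kernel too old?)")
    for name, state in sorted(vulns.items()):
        print(f"{name}: {state}")
```

On a patched box the meltdown entry reads something like `Mitigation: PTI`; a plain `Vulnerable` means you're counting on nobody ever getting a foothold on that machine.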