Intel Chips' List of Security Flaws Grows

Status
Not open for further replies.


You did see this line as the last sentence in the second to last paragraph, right?

"Intel's own benchmarks showed that the performance impact of the patches is negligible."

 

valeman2012

Distinguished
Apr 10, 2012
1,272
11
19,315


I've seen these kinds of benchmarks from the companies themselves, but we have to try it ourselves to see how many FPS we lose.
By the looks of these endless Spectre bugs, let's say you apply every Spectre security patch to an Intel i5-7600K (stock): it will perform worse than an unpatched i5-6600K.
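If you do want to measure it yourself rather than take the vendor's word for it, one rough approach (a minimal sketch, assuming a Linux box with GCC, and not from the article) is to time raw syscall round trips, which is where the Meltdown/KPTI patches bite hardest, and run the same loop with the kernel mitigations on and off:

#define _GNU_SOURCE
#include <stdio.h>
#include <time.h>
#include <unistd.h>
#include <sys/syscall.h>

int main(void)
{
    enum { N = 1000000 };
    struct timespec t0, t1;

    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (int i = 0; i < N; i++)
        syscall(SYS_getpid);          /* force a real user/kernel round trip */
    clock_gettime(CLOCK_MONOTONIC, &t1);

    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
    printf("%.1f ns per syscall\n", ns / N);
    return 0;
}

Run it once normally and once with the page-table-isolation mitigation disabled at boot (on a test machine only), and the difference gives you a concrete number instead of a marketing adjective. Game FPS will usually shift far less than this worst case, since games spend most of their time in user space.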
 

rantoc

Distinguished
Dec 17, 2009
1,859
1
19,780
Great, the Spectre patches set performance back about a year, thanks to Intel's small generational gains while it faced little competition, i.e. it was in milking mode... What will happen this time around? Another year's worth of performance lost once everything is "patched"?

Where is my refund?
 

Dantte

Distinguished
Jul 15, 2011
161
58
18,760


"...negligible" is a relative term; negligible for you, or Intel, may not necessarily be negligible for leoscott, he may value a 1-FPS lost. Needless to say, there are "more slowdowns", these slowdowns are just "negligible" in the eyes of Intel.
 

wownwow

Commendable
Aug 30, 2017
37
1
1,535
The "Meltdown" and now "Foreshadow" triplets, these "repeatedly not following the specs" instances are now unlikely typical design bugs but the well-planned, INTENDED (as the ex-CEO said "the intended designs") cheating for performance!!!

All the sellers selling products with these known INTENDED flaws should face fraud charges in criminal court for selling such products!

Are FTC employees still hibernating while their paychecks auto-deposit? Are the consumer-rights organizations still hibernating too?

Intel is now upgraded to "Cheating Inside" from "Bug Inside"!

Buy Intel "Cheating Inside" products and get unlimited all-you-can-have patches for free!
 

alextheblue

Distinguished

The alternative isn't great either, FYI. Would you like a Bonnell-derived chip with complete garbage performance?
 

Sam Hain

Honorable
Apr 21, 2013
366
0
10,960


That last part of your statement instantly put Cousin Eddie (from National Lampoon's Christmas Vacation) in my head... Are we, as Intel customers, stuck with Cousin Eddie in perpetuity???
 

termathor

Distinguished
Jan 16, 2015
75
1
18,645
"That means most older PCs and laptops may not be fully protected against the Foreshadow attacks."

Don't get me started on this. ASUS Z97 mobos have yet to get a proper new UEFI patched against Spectre to this day!
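For what it's worth, on Linux you don't have to wait for the board vendor to tell you where you stand: kernels from 4.15 on report their view of each flaw under /sys/devices/system/cpu/vulnerabilities/. A minimal sketch that dumps those files (assuming Linux and GCC; purely illustrative, not from the article):

#include <dirent.h>
#include <stdio.h>

int main(void)
{
    const char *dir = "/sys/devices/system/cpu/vulnerabilities";
    DIR *d = opendir(dir);
    if (!d) { perror(dir); return 1; }    /* older kernels lack this directory */

    struct dirent *e;
    while ((e = readdir(d)) != NULL) {
        if (e->d_name[0] == '.')
            continue;                     /* skip "." and ".." */
        char path[512], line[256];
        snprintf(path, sizeof path, "%s/%s", dir, e->d_name);
        FILE *f = fopen(path, "r");
        if (!f)
            continue;
        if (fgets(line, sizeof line, f))
            printf("%-16s %s", e->d_name, line);  /* e.g. "l1tf  Mitigation: PTE Inversion" */
        fclose(f);
    }
    closedir(d);
    return 0;
}

If the entries still say "Vulnerable" after every available BIOS and OS update, that's exactly the older-hardware gap the article is talking about.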
 

Kaz_2_

Prominent
Jul 12, 2017
24
0
510
It does affect Intel CPU performance. You will never hear one word of truth from them; they tell dozens of lies.
 

stdragon

Admirable


It's been reported [1] that Intel Atom and Celeron processors going back to 2013 are affected by Spectre. In addition, ARM acknowledged [2] that the following CPUs are affected by all four variants of the exploit to some degree or another: Cortex-R7, R8, A8, A9, A12, A15, A17, A57, A72, A73, A75, and A76. Apple also reported [3] that its A-series CPUs are affected by Spectre as well, with iOS mitigations now in place.

The problem here isn't Intel per se, and it isn't a "bug". It's a fundamental flaw in the computing-science philosophy of how speculatively executed data is handled. So when Intel said it wasn't a "bug", I agree with them 100%; their CPUs are working exactly as designed and expected. The problem was a fundamental oversight in how to secure speculatively executed data.

For now, at least with Intel (Haswell and newer), Windows will utilize Process Context ID (PCID) so long as the CPU also supports Invalidate PCID (INVPCID) in hardware [4]. But ultimately, this entire clown show will have to be revisited from the ground up so that the CPU and the OS integrate seamlessly from a security standpoint. How exactly that will happen is beyond me. In the meantime, you will continue to see either mitigations, or features such as Hyper-Threading disabled entirely [5] (as in OpenBSD), to preempt any future Spectre vulnerability variants.
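If you want to check whether your own chip advertises those two features, a minimal sketch (assuming GCC or Clang on x86 with the <cpuid.h> helpers; illustrative only, not from the cited article) reads the CPUID feature flags directly: PCID is leaf 1, ECX bit 17, and INVPCID is leaf 7, sub-leaf 0, EBX bit 10.

#include <cpuid.h>
#include <stdio.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* Leaf 1: ECX bit 17 = PCID */
    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx))
        return 1;
    int pcid = (ecx >> 17) & 1;

    /* Leaf 7, sub-leaf 0: EBX bit 10 = INVPCID (only valid if leaf 7 exists) */
    int invpcid = 0;
    if (__get_cpuid_max(0, NULL) >= 7) {
        __cpuid_count(7, 0, eax, ebx, ecx, edx);
        invpcid = (ebx >> 10) & 1;
    }

    printf("PCID:    %s\nINVPCID: %s\n",
           pcid ? "yes" : "no", invpcid ? "yes" : "no");
    return 0;
}

On a pre-Haswell part you would typically see PCID without INVPCID, which is exactly the case where Windows falls back to the slower mitigation path.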


Cited references:
[1] https://www.heise.de/security/meldung/Spectre-NG-Intel-verschiebt-die-ersten-Patches-koordinierte-Veroeffentlichung-aufgeschoben-4043790.html

[2] https://developer.arm.com/support/arm-security-updates/speculative-processor-vulnerability

[3] https://support.apple.com/en-ca/HT208394

[4] https://arstechnica.com/gadgets/2018/01/heres-how-and-why-the-spectre-and-meltdown-patches-will-hurt-performance/

[5] https://thehackernews.com/2018/06/openbsd-hyper-threading.html
 
Jul 12, 2018
4
0
10
Alright, let's everyone just calm down. Most of the comments here read like cheap shots from people who begrudge Intel or simply prefer a different company, but they don't actually say much. How many people have been sent back to 1990 in terms of CPU performance? I remember computers even older than that; if you want to complain about slow, try using the machines we had to work with in the '80s and '90s. The equivalent wait for today's generation is probably only 5-10 years compared to the wait I went through for decent computers. I was born in 1981, so by most research standards I don't belong to any generation, lol, but I grew up knowing what it's like not to have all the world's knowledge in the palm of my hand, so I also know how to judge my own expertise on a topic.

I'm not a computer science major, so beyond quantum computing I wouldn't have the slightest clue how you would build and implement CPUs from the ground up without tearing economies apart. Could such a CPU seamlessly talk to current hardware, or would there be a cutoff date where nothing before it can effectively speak to anything after it? If it were just a coding issue, I could better fathom it. Regardless, even if I turned Hyper-Threading off I'd be left with 4 cores in my laptop and 6 in my desktop, compared to a single, terribly slow core for decades. I can write more powerful math programs on my graphing calculator than any computer before 2005 could run. (If you don't own a modern graphing calculator, that's actually fine; I don't think that industry is really needed anymore with laptops and tablets around.) If you have to do without Hyper-Threading, remember how long most of us went with one core, and then two, for most of our lives. Even today 4 cores can be overkill. I understand programs are getting better at using more cores, but when the status quo is still 2- and 4-core CPUs, I can't imagine losing Hyper-Threading is going to rock most people's worlds, assuming that's even necessary. So far this sounds like a highly reactionary, appeal-to-emotion argument, since people respond when you make them afraid or frenzied.

The reason Intel is stuck as the de facto spokesman for this is that they own about 80% of the market, last I checked (and that wasn't long ago, since I follow these things from a purely stock-market perspective). I'm sure AMD's processors, and everyone else's, will be found to be fallible too. The way I'll handle it is rational logic: if most programmers and leaders in computer science say there's no need to shut off Hyper-Threading as long as your computer meets certain safety requirements in all other areas, I'll trust them. For what it's worth, I don't notice a huge difference with Hyper-Threading on or off on either of my computers, so I don't think it will disrupt most people. It's definitely not worth paying for more than 6 cores; when I priced out my last desktop I saw the jump in prices, and I'm not spending what could buy me a high-end computer just to buy the latest AMD 16-core CPUs or i9 processors. So if you're like me, whether you like Intel or not, you'd better hope their idea of "negligible" performance loss matches yours, since every chip maker should be subject to the same vulnerabilities. I did notice that 0.01% of the market is occupied by a company and chip I've never heard of. I'm not inclined to look it up, since I wouldn't understand it anyway, but if anyone else does, feel free to explain it.
Everything else I've told you is valid regardless of my expertise (it's physics and statistics, if anyone is curious). And quantum computers for the average person are as far off as a home computer was during the space race, assuming they pan out at all, since so far they mostly work in theory. I do know functioning ones exist right now, but they take up the size of a house, require coolants the average person couldn't properly handle at home, and the most complicated thing I've seen one accomplish is 3 times 5 equals 15. You're dealing with qubits, and generally no one understands what a qubit really is without extensive education in physics, quantum physics to be more specific. The easy way to explain it is that you can have a 0, or a 1, or both at once: the qubit's state is a blend of 0 and 1 rather than a single definite value. That would nullify most security concerns in the future, so long as people avoid clicking attachments in emails, lol; it's even more foolproof than using PGP encryption over a VPN to communicate with someone. Cryptography isn't what makes it so interesting, though. It's the scalability of such a system and its computing power, since each bit isn't confined to the logic or reality we're used to at the macroscopic level; quantum particles can be in multiple places at the same time. That and quantum entanglement are areas where only a few dozen people really know the mathematics behind what's happening, so I won't pretend to. I only know it makes computing much faster than any transistor-based chip can, and it may be the key to true intelligence as opposed to convincing algorithms. I at least hope so; otherwise we need to question whether we ourselves just run algorithms and don't have true intelligence.
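For the curious, the standard textbook way to say that (general quantum-computing notation, not anything from this thread) is that a qubit's state is a superposition

\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1, \quad \alpha, \beta \in \mathbb{C},
\]

and a measurement returns 0 with probability \(\lvert\alpha\rvert^{2}\) or 1 with probability \(\lvert\beta\rvert^{2}\), so the "any value between 0 and 1" picture really describes those probabilities rather than the bit itself.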

The bottom line is that you should be more concerned with any number of societal issues, from overpopulation and nuclear proliferation to climate change, before doing without Hyper-Threading becomes your largest complaint. Hell, I'd worry about an extinction-level natural event first, or a human-caused one, which statistically seems to be happening and following the progression of all past extinction events: they take around 40,000 years and start slowly with large megafauna dying off. So I'm not saying ignore this issue, but don't ruin an engine by putting out a false fire alarm (an extinguisher generally destroys electronic components). I would have used computers for the analogy, but that's what we're talking about to begin with.

Again, the technical details of what's happening are not something I'm an expert on by any means; I'm going off what most programmers tell me, and I think they assume I'm not running any legacy software. I'm sure a set of recommendations could be put together on who is safe and who should turn off Hyper-Threading to prevent the issue, and if worst comes to worst, have the computer use Hyper-Threading only when processing local programs. Trust me, I have to run really compute-intensive mathematical series and statistical problems; physical cores are what make that go faster, not Hyper-Threading. And as long as graphics cards don't suffer from the same type of flaw, I've found borrowing their power for math-intensive programs and physics problems to be beyond awesome. But I had to work with computers in the '90s, a time when the TI-89 Titanium did a better job than most computers at solving problems. After college, the first Core 2 Duos and the 8800 GTS with SLI came out; after that, no one should have needed a graphing calculator again except to keep Texas Instruments in business.

Take it or leave it, but I also forgot to mention that we now have SSDs being common, DDR4 memory modules, GDDR6 memory coming out, USB ports that transfer faster than front-side buses used to, and Wi-Fi that lets you download as quickly as you can type (which is probably where you pick up attempted attacks on your CPU). When 5G networks debut, your cellphone and tablet should upload and download faster than Wi-Fi over a cable modem; my guess is it won't be as cheap, since it takes a lot of effort to build all those towers. And you might be surprised to learn that almost none of your cellphone data actually travels over satellite, which makes cell towers that much more important. Sorry, I have ADHD and type around 70 wpm, so don't get bent out of shape if I got something wrong. I'm simply going by what actual experts have told me and my own experience using only my physical cores, which are becoming much more numerous, so I hope more programs can make use of them regardless of HT. Six physical cores alone is something I never thought I would see, and afford, in a rather inexpensive high-end computer: I spent $2,000 and put it together myself, right before the GTX 1080 Ti jumped hundreds of dollars in price thanks to friggin' miners. There's something you can worry about and solve: how are cryptocurrencies that rely on intense calculations supposed to have lasting power?
They eat up power like a shark feasting on a whale, and at some point it won't be worth spending money on the electricity, or all the currency will have been mined, which doesn't give me any incentive to mine even if I had one to begin with. (I actually bought thousands of BTC at one cent and spent very little of it, since it couldn't really be spent; a few years later I have a nice, nice nest egg for a few hundred dollars, lol. And no, I never saw it becoming as valuable as it did; I was just lucky enough to have forgotten about a lot of unspent coins until late last year.) I also mostly do actuarial work, so any time anything increases in value like that, especially when Joe Everyman jumps on board, I know it's time to abandon ship until things calm down again.

I wrote a lot, but basically I wouldn't worry about this. Any decrease in speed would most likely be a placebo effect on your part, since it takes the brain milliseconds to process and stitch together all our senses to make it seem as if it's all happening in real time. The brain is actually a good example of a quantum computer; we just don't notice all the calculations it's doing every second, but it's doing a lot. You don't balance yourself, breathe, pump your heart, or produce enzymes with conscious thought. Let's just hope no one ever learns to hack biological tissue.
 