Alright, let's everyone calm down. Most of the comments here read like cheap shots from people who begrudge Intel or simply prefer another company, without actually saying much. How many people have been sent back to 1990 in terms of CPU performance? I remember computers even earlier than that. If you want to complain about slow, try using the machines we had to work with in the '80s and '90s. I'd say the equivalent wait for today's generation is maybe 5-10 years compared to the wait I went through for decent computers.

I was born in 1981, so by most research standards I don't belong to any generation, lol. But I grew up knowing what it's like to not have all the world's knowledge in the palm of my hand, so I also know how to honestly judge my own expertise on a topic. I'm not a computer science major, so apart from quantum computing I wouldn't have the slightest clue how you'd build and roll out CPUs from the ground up without tearing economies apart. Could such a CPU seamlessly talk to current hardware? Or would there be a cutoff date, where nothing before it can effectively speak to anything after? If it were just a software issue, I could better fathom it.

Regardless, even if I turned hyperthreading off, I'm left with 4 cores in my laptop and 6 in my desktop, compared to 1 core for decades. A terribly slow one at that. I can write more powerful programs on my graphing calculator for math problems than any computer before 2005 could run. (If you don't own a modern graphing calculator, that's actually fine; I don't think that industry is really needed anymore with laptops and tablets.) But if you have to do without hyperthreading, remember how long most of us went with 1 core, and then 2, for most of our lives. Even today 4 cores can be overkill. I understand programs are getting better at using more cores, but when the status quo is still 2- and 4-core CPUs, I can't imagine losing hyperthreading is going to rock most people's worlds.
That's if disabling it is even necessary. So far this sounds like a highly reactionary, appeal-to-emotion argument, since people respond when you make them afraid or frenzied. The reason Intel is stuck being the de facto spokesman for this is that they own about 80% of the market, last I checked. And it wasn't long ago, since I do follow these things from a purely stock-market perspective. I'm sure AMD processors, and every other processor, will be found fallible too.

The way I'll handle it is rational logic: if most programmers and leaders in computer science say there's no need to shut off hyperthreading as long as your computer meets certain safety requirements in all other areas, I'll trust them. For what it's worth, I don't notice a huge difference with hyperthreading on or off on either of my computers, so I don't think this will disrupt most people. It's definitely not worth paying for more than 6 cores; when I priced out my last desktop I saw the jump in prices, and I'm not spending what could buy me a whole high-end computer on the latest 16-core AMD CPUs or i9 processors. So whether you like Intel or not, you'd better hope their idea of "negligible" performance loss matches yours, since every chip maker should be subjected to the same scrutiny for the same class of vulnerability.

I did notice about 0.01% of the market is occupied by a company and chip I'd never heard of. I'm not inclined to look it up, since I wouldn't understand it anyway, but if anyone else does, feel free to explain it. Everything else I've told you is valid regardless of my field of expertise (it's physics and statistics, if anyone is curious). And quantum computers for the average person are as far off as a home computer was during the space race. That's assuming the technology pans out at a useful scale.
I do know there are functioning ones right now, but they take up the size of a house, require coolants the average person couldn't safely handle at home, and the most complicated thing I've seen one accomplish is 3 times 5 equals 15. And you're dealing with qubits, and generally no one understands what a qubit really is without extensive education in physics. Quantum physics, to be specific. The easy way to explain it is that you can have a 0, a 1, or both: instead of being pinned to 0 or 1, the state is a weighted blend of the two at once.

That would nullify most security concerns in the future, so long as people avoid opening attachments from emails, lol. It's even more foolproof than using PGP encryption over a VPN to communicate with someone. Cryptography isn't what makes it so interesting, though; it's the scalability of such a system and its computing power, since each bit is not confined to any logic or reality we're used to at the macroscopic level. Quantum particles can be in multiple places at the same time. That and quantum entanglement are areas where only a few dozen people really know the mathematics behind what's happening, so I won't pretend to. I only know it makes computing much faster than any transistor-based chip could manage. And it may be the key to true intelligence, as opposed to convincing algorithms. I at least hope so; otherwise we need to ask whether we ourselves just run algorithms and don't have true intelligence either.
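The "0, a 1, or both" idea can be sketched in a few lines. This is just a minimal illustration of the standard amplitude picture of a single qubit, nothing specific to any real quantum hardware, and the function name and numbers are mine for illustration only:

```python
import math

# A single qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1. Measuring it yields 0 with probability
# |alpha|^2 and 1 with probability |beta|^2, so the qubit is not
# "some value between 0 and 1" but a weighted blend of both at once.

def measurement_probs(alpha: complex, beta: complex):
    """Return (P(0), P(1)) for a qubit state alpha|0> + beta|1>."""
    norm = abs(alpha) ** 2 + abs(beta) ** 2
    return abs(alpha) ** 2 / norm, abs(beta) ** 2 / norm

# An equal superposition: alpha = beta = 1/sqrt(2)
p0, p1 = measurement_probs(1 / math.sqrt(2), 1 / math.sqrt(2))
print(p0, p1)  # both 0.5: equally likely to read 0 or 1
```

The speedups come from operating on all those weighted possibilities at once before measuring, which is exactly the part that takes real quantum-mechanics training to follow.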
The bottom line is that you should be more concerned with any number of social issues, from overpopulation and nuclear proliferation to climate change, before doing without hyperthreading becomes your largest complaint. Hell, I'd worry about an extinction-level natural event first. Or a human-caused one, which statistically seems to be underway and following the pattern of past extinction events: they take around 40,000 years and start slowly, with large megafauna dying off.

So I'm not saying ignore this issue, but don't ruin an engine by putting out a false fire alarm (an extinguisher generally destroys electronic components). I would have used computers for the analogy, but that's what we're talking about to begin with. Again, the technical details of what's happening are not something I'm an expert on by any means; I'm going off what most programmers tell me, and I think they assume I'm not running any legacy software. I'm sure a set of recommendations could be put together on who is safe and who should turn off hyperthreading to prevent the issue. And worst comes to worst, have the computer only use hyperthreading when it's processing local programs.

Trust me, I have to run really compute-intensive mathematical series and statistical problems. Physical cores are what make them go faster, not hyperthreading. And as long as graphics cards don't suffer from the same type of flaw, I've found borrowing their power for math-heavy programs and physics problems to be beyond awesome. But I had to work with computers in the '90s, a time when the TI-89 Titanium did a better job than most computers at solving problems. After college the first Core 2 Duos came out, along with the 8800 GTS with SLI support. After that, no one should have needed a graphing calculator again, except to keep Texas Instruments in better shape business-wise.
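The "physical cores, not hyperthreading" point can be sketched like this. It's a minimal illustration, assuming a CPU-bound numeric job: splitting the work across separate processes (one per physical core) is where the real speedup comes from, since hyperthreading adds comparatively little when each core's execution units are already saturated by math. The series and worker counts are my own toy example:

```python
import math
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum series terms over [lo, hi); stands in for real numeric work."""
    lo, hi = bounds
    return sum(1.0 / (k * k) for k in range(lo, hi))

def parallel_series(n_terms: int, n_workers: int) -> float:
    """Approximate sum_{k=1}^{n} 1/k^2 split across n_workers processes."""
    step = n_terms // n_workers
    chunks = [(1 + i * step, 1 + (i + 1) * step) for i in range(n_workers)]
    chunks[-1] = (chunks[-1][0], n_terms + 1)  # last chunk takes the remainder
    with Pool(n_workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # The series converges to pi^2 / 6 (the Basel problem)
    print(parallel_series(1_000_000, 4))
```

Timing this with `n_workers` equal to your physical core count versus your logical (hyperthreaded) count is an easy way to see how small the HT contribution is for this kind of load.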
Take it or leave it, but I also forgot that we now have SSDs as standard, DDR4 memory modules, GDDR6 memory coming out, USB ports that transfer faster than front-side buses used to, and Wi-Fi that lets you download your porn as quickly as you can type (which is probably where you pick up attempted attacks on your CPU anyway). When 5G networks debut, your cellphone and tablet should upload and download faster than Wi-Fi over a cable modem. My guess is it won't be as cheap, since it takes a lot of effort to build all those towers. And you might be surprised to learn that almost none of your cell phone data actually travels by satellite, which makes cell towers that much more important.

Sorry, I have ADHD and learned to type around 70 wpm, so don't get bent out of shape if I got something wrong. I'm simply going by what actual experts have told me and my own experience using only my physical cores. Those are becoming much more numerous, so I hope more programs can make use of them regardless of HT. Six physical cores is something I never thought I'd see, and afford, in a fairly inexpensive high-end computer: I spent 2,000 dollars and put it together myself. That was right before the GTX 1080 Ti jumped hundreds of dollars in price thanks to friggin' miners.

There's something you can worry about and solve: how are cryptocurrencies that rely on intense calculations supposed to have lasting power? They eat electricity like a shark feasting on a whale, and at some point either it won't be worth spending money on the power, or all the currency will be mined, which gives me no incentive to mine even if I had any to begin with. (I actually bought thousands of BTC at 1 cent and spent very little of it, since back then it couldn't really be spent anywhere.) A few years later I have a nice, nice nest egg for a few hundred dollars, lol. And no, I never saw it becoming as valuable as it has. I'm just lucky I remembered late last year that I had a lot of unspent coins.
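The mining-economics worry boils down to one piece of arithmetic: do the coins earned per day outvalue the electricity burned per day? Here's a back-of-envelope sketch; every number in it is a hypothetical placeholder, not real market data:

```python
# Toy break-even check for mining: revenue per day vs. power cost per day.
# All inputs are made-up illustration values, not actual rates or prices.

def daily_mining_profit(rig_watts: float,
                        usd_per_kwh: float,
                        coins_per_day: float,
                        coin_price_usd: float) -> float:
    """Revenue minus electricity cost for one day of mining."""
    kwh_per_day = rig_watts / 1000 * 24
    power_cost = kwh_per_day * usd_per_kwh
    revenue = coins_per_day * coin_price_usd
    return revenue - power_cost

# Hypothetical rig: 300 W draw, $0.15/kWh, earning 0.0001 coin/day at $5000/coin
profit = daily_mining_profit(300, 0.15, 0.0001, 5000)
print(round(profit, 2))  # -0.58: 0.50 in revenue vs 1.08 in power, a daily loss
```

As difficulty rises, `coins_per_day` shrinks for the same wattage, so the number goes negative for more and more miners unless the coin price climbs to compensate.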
I also mostly do actuarial work, so anytime anything increases in value like that, especially when Joe Everyman jumps on board, I know it's time to abandon ship until things calm down again.
I wrote a lot, but basically I wouldn't worry about this. Any decrease in speed would most likely be a placebo effect on your part, since it takes the brain milliseconds to process and stitch together all our senses to make it seem as if everything happens in real time. Some even argue the brain is a good example of a quantum computer. We just don't notice all the calculations it's doing every second, but it's doing a lot. You don't balance yourself, breathe, pump your heart, produce enzymes, etc. with conscious thought. Let's just hope no one ever learns to hack biological tissue.