Microsoft: NSA’s Bug Hoarding To Blame For WannaCry Ransomware Spread

Doing the right thing is expensive and usually makes you economically unviable. That is the sad truth for most IT-related things in the world right now.

There are a lot of catchy phrases and acronyms to justify not doing the right thing, but it always comes down to cost. Investors don't want to put products out there with the extra added cost of security (like, REAL security) unless there are very strict market laws around it (I work in such a market and we always have 10x the regular budget for security alone). And that is just talking about regular pieces of software, not an entire operating system.

In an ideal world, everyone would be a responsible adult and look out for the greater good, but in reality everyone is a shallow, petty individual who wants short-term benefits. Reality for you all, huh?

Cheers!
 


Patching would have helped, but for some reason people assume that this ransomware only works on OSes older than Windows 7. I think there are several fingers to point here. The first would be security culture, or rather the lack thereof. Second would be poor funding: some of these places could not afford to upgrade to newer systems, let alone hire a competent IT/infosec person who knows better than to have external-facing systems with SMB and RDP allowed. Third falls a bit under the first: the lack of a security-minded culture has government agencies, and companies that sell exploits for the purpose of spying, thinking that it's OK to hoard and that they will be exempt from losing control of those exploits. Fourth still falls under the first: companies that refuse to run bug bounties and/or refuse to reward the security researchers who find the bugs. Fifth is also under the purview of the first: companies that create products and refuse to make security a priority. These are the problems that plague the infosec community today, as well as the world. Does Microsoft fall under one of these? Probably. Does the NSA? Yes. Who is at fault? Everyone.
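For what it's worth, that particular misconfiguration is easy to spot from the outside. Below is a minimal sketch of a reachability check; the host name is a hypothetical placeholder, 445 and 3389 are the standard SMB and RDP ports, and a real audit would use proper scanning tools and only be run against systems you're authorized to test. WannaCry spread over SMB (TCP 445), which is exactly why that port should never be internet-facing.

import socket

HOST = "example-server.example.com"  # hypothetical host, for illustration only
PORTS = {445: "SMB", 3389: "RDP"}    # standard SMB and RDP ports

for port, name in PORTS.items():
    try:
        # Attempt a plain TCP connection; success means the port answers from here.
        with socket.create_connection((HOST, port), timeout=3):
            print(f"{name} (TCP {port}) is reachable; it should not be internet-facing")
    except OSError:
        print(f"{name} (TCP {port}) appears closed or filtered")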
 


I agree with you, but with time you learn to write better code and to know what works and what doesn't. Templates are a good thing for that, so you don't have to redesign your code all the time, and you save time too.
 


Code evolves, just like the platforms around the code evolve. They get more complex with added functionality, and with added complexity the testing effort grows quite a bit faster than the actual development.

So, in short: not really.

Cheers!
 
This would only be a wake-up call to those of us who are already awake and know the government is full of lowlife scum, and that the NSA, as well as the CIA and FBI, couldn't care less about the people they were hired to protect. Their end goal is to make themselves look good, fill their quotas, and keep their pensions and private prisons full. This information can be delivered to the people every day, but the fact is there are far too few of us who care enough about our privacy to ever stand up to the man and demand this stop! We live in a world where mindless sheep can vote, and there simply aren't enough of us who know better to make a difference.
 


You said

...which makes it sound like you're saying they can sit back and wait until they know that someone else knows about a bug before they have to fix it. My position is that they should try to fix the bugs they know about, whether they learned about them from someone else or through internal testing, and follow a sane prioritization based on risk (and, yes, taking into account whether a bug has been reported or is actively being exploited).

However, that basically amounts to damage control. What they really need to do is security by design, not security as an afterthought. And they probably need to hire outside firms to find exploits, as well as offer bounties.

I'm certainly not under the impression that it's feasible to ship perfect software. Maybe in small amounts for mission-critical devices, but not for anything with significant scale.

Going forward, all tools need to be brought to bear on the problem of security: everything from the programming language level to runtime libraries and the runtime environment. I think we're learning that simply writing software to do XYZ isn't the hard part. Doing it securely might actually be the biggest challenge.
 

And my point is that Microsoft didn't know about it until two months ago, and it did fix it back then. So even by your standards, Microsoft has already done everything it could reasonably be expected to do.
 
Right, but the whole reason I even replied to your original message hinged on your use of the word "disclosed". I thought it was clear from my posts that waiting for disclosure was the real issue I was raising.

That said, I stick by my points about security-oriented design & engineering, as well as ongoing testing.
 

You can do a lifetime's worth of ongoing testing and you still won't catch every bug, which is why, every year, we see a handful of 10+ year-old bugs discovered in every major piece of software that is at least that old and still under active development. There are practical limits to how much time, effort and money can be sunk into testing before you have to call it done until proven otherwise.
 

You're assuming methodologies remain unchanged. Static analysis is improving, as are programming languages and design practices. I don't know as much about the testing side of things, but the point is that the industry has to step up its game, and not just by throwing more people-hours at it.
 

No, I'm assuming that no matter how much more time, how many more resources and new technologies you throw at testing, it will still be incomplete. Code will never be 100% bug-free.
 
Sure. I didn't mean to say otherwise. Just that software vendors need to do a better job... and they can!
 

Only if you make it economically viable. If you want software developers to spend 10x more resources on testing, expect to pay substantially more and wait longer for that testing.

The impossibly huge testing effort required to design truly secure software is why absolutely critical security functions are typically handed off to embedded secure microcontrollers running a tiny security kernel, or to dedicated logic that can be fully audited with reasonable means and effort.
 
So that's my point. The way to make software more secure is not simply to throw more people-hours at it. We're probably already past the point of diminishing returns there.
 


It's about how the market you're targeting with the software is regulated.

Laws have a BIG influence on how much effort companies put in.

If *people* want a more secure software ecosystem for a particular segment, then people themselves should pressure their representatives into legislating around that.

A quick example is the consumer information market: it is heavily regulated, so the law punishes companies dearly if they mess up. That is why you don't see as many data breaches at big corporations that handle consumer information as you do at less regulated companies that don't hold "important" information.

So, back to the point: tougher laws around software bugs will go a long way :)

Cheers!
 

It also goes a long way towards reducing the number of people and companies who can keep up with those requirements, which reduces the often already limited competition and increases costs.

Also, many of the largest data breaches in recent history were initiated by human error. In reasonably secure systems, the humans who have access to them are the easiest way to bypass most of the security.
 