> Anybody who’s seen the systems inside a major tech company knows this is true. Or a minor tech company. Or the insides of any product with a software component.
Anybody who's seen the sun rise in the east and set in the west knows that this is true: the Sun revolves around the Earth.
The other, less obvious alternative is that software is NOT crap; we just like to complain a lot. And it's just so easy and fun to blame everything on the software we work with.
Most software that doesn't immediately die does its job fairly well. In fact, better than whatever alternative existed before it.
Well, we would expect no better from medicine, or home construction, or automobiles without some basic guarantees:
1. High visibility of problems (people die, buildings collapse, cars crash)
2. Regulation and inspection (some standards for evaluation and lines that should not be crossed)
3. Certification (you wouldn't let a random uncertified person do heart surgery on you, or build your house).
But for this to be possible, we as a whole would have to agree these are priorities and impose these limits. We're still in the era of "surgery without anaesthesia", or "no building codes", or "just let the factories emit whatever they want". That's understandable given the short time that software has been really "important" in day-to-day life, but now it's time to slow down and make sure we do it right.
A more precise understanding of faults and how to deal with them can go a long way before formal verification even comes up. A bug is not a fault, but it may cause faults. You can try to find the bug beforehand, or you can deal with the faults at runtime. In practice it's not just bugs that cause faults: hardware failures, natural disasters, human mistakes and so on do too, so you have to deal with faults at runtime either way.
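The distinction between removing a bug beforehand and containing a fault at runtime can be sketched like this. A minimal Python sketch; `Flaky` is a hypothetical stand-in for any operation whose faults we cannot eliminate:

```python
class Flaky:
    """Hypothetical stand-in for an operation that can fault at runtime:
    a transient disk error, a dropped connection, a hardware hiccup."""

    def __init__(self, failures_before_success: int):
        self.remaining = failures_before_success

    def call(self) -> int:
        if self.remaining > 0:
            self.remaining -= 1
            raise IOError("transient fault")
        return 42


def with_runtime_fault_handling(op: Flaky, retries: int = 5) -> int:
    # The fault source is not a bug in our code, so we cannot "fix" it
    # beforehand; instead we contain it at runtime: retry a bounded
    # number of times, then surface the failure.
    for _ in range(retries):
        try:
            return op.call()
        except IOError:
            continue
    raise RuntimeError("gave up after retries")


print(with_runtime_fault_handling(Flaky(3)))  # prints 42 after 3 faults
```

The bounded retry is the key design choice: unbounded retries would just turn one fault into a hang.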
> But we can certainly arrange to radically limit the scope of damage available to any particular piece of crap, which should vastly reduce systemic crappiness.
Now, this is the big idea behind supervision trees: you split everything into the smallest possible isolated processes, with supervisors that watch over them, so that when any process fails it can simply be restarted, limiting the scope of the problem to that one tiny process for the shortest possible time. This idea might even reduce the cost of software development compared to some more popular practices, but it does require an easy-to-use actor model in the language.
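A minimal in-process Python sketch of that restart idea. This is not real OTP: the "child" here is just a function rather than an isolated process, and `make_flaky_child` is a hypothetical stand-in for a buggy worker:

```python
def make_flaky_child(crashes_before_success: int):
    # Hypothetical child that crashes a few times before completing,
    # standing in for a process with an intermittent bug.
    state = {"left": crashes_before_success}

    def child() -> str:
        if state["left"] > 0:
            state["left"] -= 1
            raise RuntimeError("child crashed")
        return "done"

    return child


def supervise(child, max_restarts: int = 5):
    # One-for-one strategy: when the child fails, restart only that
    # child; the failure never propagates past the supervisor. The
    # restart limit keeps a permanently broken child from looping
    # forever (OTP calls this restart intensity).
    restarts = 0
    while True:
        try:
            return child(), restarts
        except RuntimeError:
            restarts += 1
            if restarts > max_restarts:
                raise


result, restarts = supervise(make_flaky_child(2))
print(result, restarts)  # prints: done 2
```

In a real actor system the child would be a separate process with its own memory, so a crash could not corrupt its siblings; the restart logic, though, looks much like this.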
This is cultural. The Walmart-ification of the world's products. Planned obsolescence. The short-term-profits, get-the-stock-price-up-by-the-end-of-the-quarter mentality. The credit card mindset.
How fast society moves, and how cutthroat capitalism is becoming... it's all about short-term gain, because the future feels chaotic and unstable to most of the world.
I personally think it's the fact that money is getting more and more difficult to obtain, because more and more of it is being hoarded in the upper strata of society. So short-term mindsets about financial gain take over, because financial opportunities don't come easily.
This is reflected in software development.
TFA mentions Google's Caja and Secure EcmaScript (SES). Does anybody know if that's still in development or under consideration by ECMA or others?
“The reason this stuff is crap is far more basic. It’s because better-than-crap costs a lot more, and crap is usually sufficient.
... ... ...
Every dollar put into making software less crappy can’t be spent on other things we might also want, the list of which is basically endless.”
Software is crap because humans on the whole have an extremely difficult time reasoning through all the possible logic flows.
Software is also crap because in order to design good software a lot of insight, experience and empathy is required, and as it turns out, people writing crap don’t have any of those prerequisites.
Finally, most software is crap because most people writing it don’t care about getting it 100% correct, since it turns out that getting the last 5% to function correctly is exponentially difficult, and those people would rather collect a paycheck than feel proud and content about what they wrote.
All other industries manage to deliver complete, working products; we are the only industry which doesn’t; computers and software never work 100% correctly. What does that make us? It makes us jerk-offs, that’s what it makes us. I understand now why Keith Wesolowski ditched computers and went to work on a ranch. And damn it, he is right!
We have some formal verification; it's called type systems. Very limited, I know. But some people are still praising dynamically typed languages for their "speed" and "lack of compiler errors".
Like anything else, you just don't see it.
I thought you had a point until you mentioned Microsoft. The company that showed us it's perfectly fine and socially accepted to make top money from crap, not release security updates, and be proud of it. Thanks, MS!
And then, hardware is crap too...
>And still, this wouldn’t be so bad, if the crap wasn’t starting to seep into things that actually matter.
>A leading indicator of what’s to come is the state of computer security. We’re seeing an alarming rise in big security breaches, each more epic than the next, to the point where they’re hardly news any more. Target has 70 million customers’ credit card and identity information stolen. Oh no! Security clearance and personal data for 21.5 million federal employees is taken from the Office of Personnel Management. ...
I’m not a fan of this type of article, the article where we find a group of bad things and then go find something to blame for it. It’s easy to say all this stuff was caused because “we just don’t make stuff like we used to”, but I think it’s wrong-headed.
Or, more probably, the phrase is used in the wrong sense. It's true that we don't make things like we used to: we have made giant leaps past anything we've ever done before. We have hundreds of thousands of companies and millions of devices online, a level of interconnectedness that the planet has never seen before. We also have more accessible software development tools and many more people writing software than ever before.
Increasing numbers of security breaches are a sign of our progress, not a sign of our failures. They are simply a sign that there is a lot more stuff attached to the internet, and that unfortunately includes people with the intention to commit crimes.
Should we tolerate security breaches? No. Should we celebrate them? Of course not.
But let’s see them for what they are and search for a solution instead of something to vilify.