
Note the punch line at the end:

> Was it worth it?

> The marketplace seems to have answered this question: not at current prices. See [Lipner15].

Note that, unlike our discussion below, this is about the cost of implementing B2 security in an OS, not about the sale price of the OS (though the two are related).

Or perhaps I should say, the sale price of the OS is forced so high by the cost of B2 that the market isn't interested in the OS, but this would be true whether Unix was free or not.



And now we have governments stepping in to fix past decisions regarding programming culture.


I'm not sure you're still in the conversation after all this time, but if you are:

Back in the 1950s and 60s, there was the "software crisis": we could not produce all the software we felt we needed. Now we have governments, not "stepping in" exactly, but at least making recommendations. We wrote all this software, but it was imperfect in ways that cause real problems. Yet producing B2 software took too long and cost too much; we'd have much less software if we did it that way. (You could argue that the world would be better off; I'm assuming that it would not be.)

So I have a question (an honest question, not a "gotcha"): In your view, where should the balance be? You think C is too insecure. Fine; I won't say there's no evidence for that view. How far should we go? Java? (No buffer overflows.) Formal correctness? B2? A1?

Software that is more correct and secure is good. More software (or at least more efficiency in creating software) is also good. Where do you put the balance?


The balance should be like in any profession that affects society: there is a minimum bar of quality and certification before whatever is made in a garage or home kitchen is allowed to be consumed by others.



