
I think people seem to have a very romantic notion that there is some meeting where all the engineers get together and decide whether or not they are going to kill their users.


Not to "kill their users" but they realistically have meeting to vote on whether the system is "go/nogo". This is the basic structure of many aerospace designs. If you ever watch a rocket launch, they are literally checking in with every system owner for that decision. Prior to even getting to the launch date, there are meetings where they are checking with the designers for similar decisions. There are good documentaries/movies that show this dramatized in examples like the shuttle Challenger.


Yeah, and with Challenger only one engineer raised the alarm while a large group of well-meaning engineers overrode him.

This is reality. Decisions aren’t kill/not kill.


Yeah we both agree that the decision isn’t “kill/not kill”.

It’s much more ambiguous and murky, especially when dealing with low-probability events. That’s why it takes expertise, but it doesn’t mean there shouldn’t be accountability for those “expert” decisions. Other engineering domains already have this; there isn’t anything that makes these decisions inherently different.


When you criminalize the role, either no one will take it or an incompetent patsy will. I don't know when HN became a bunch of keyboard jockeys white-knighting about what they'd do... but reality is a very different story.


When you avoid accountability, you pervert the risk/benefit calculus. People get incentivized to roll the dice on low-probability/high-severity events because most of the time it will come out in their favor and they can continue cashing their checks. But on the off-times when it doesn't, people just shake their heads and say "Who could've known!?" when in reality there were people sounding the alarm the entire time. The outcome you're advocating hides the risk in the shadows where nobody talks about it. I'm advocating a structure that isn't necessarily risk-averse, but at least lays it all out for transparent, risk-informed decision making.

You're talking to someone who spent years working in the aerospace industry. It seems like you have some hypothetical idea of what reality is, but it doesn't align with my actual experience. Not to sound disrespectful, but it sounds like one of us has actual experience and the other is going off a narrative they've created in their head.

I do hear people often bring up the "incompetent patsy" excuse. But I'm curious: where do you think that stops being relevant? Do you not hold medical doctors liable for decisions because a patient will just find another "incompetent patsy" to prescribe them whatever they want? Do you not expect civil engineers to be responsible for the structural design of a bridge because a company will find an "incompetent patsy" to sign off on a sub-standard design that is better for profit margins? We're used to holding all kinds of professionals liable, but there seems to have been a cultural shift in the last 40-50 years where we've rationalized bad behavior as the norm rather than holding people accountable.


Wait, what? I'm sure here we can relate to the software side of things. How many times have you been in a position of saying "go/nogo" where saying "nogo" didn't imply it would also be your last day on the job? (or at least a severe Career Limiting Move)


For a relatively brief period, I worked as a software quality engineer in aerospace research and development. Practically every safety-critical test that used software required a “go/nogo” decision before it was allowed to run. The safety chief would give the overall “go/nogo” decision, but the subsystems (like software safety) would feed their own “go/nogo” decisions up through the chief.

As I’ve said elsewhere, good organizations implement a distinct chain of command for those decisions so they can be made more impartially. Even then, it’s not without career risk, but IMO that’s part of the gig and why it takes a certain amount of professional integrity. As someone else said, if someone isn’t up to that task, maybe developing safety-critical software isn’t the right gig for them.


I was able to formally state a "nogo" when I worked in banks. Management could bypass it, but it was then a formalized decision on their part, and it discharged me of any liability for the problem that led to my "nogo".



