Hacker News

Don't put the blame on the developers. Too frequently they are put under huge pressure and cannot say "no" to their managers. There are people who are responsible for delivery and testing, and they are paid far more than the average developer.

Unlike in engineering, with software you can - and indeed should - test it before deployment in thousands of ways.

Finally there is also a receiving party that should ensure they are not accepting broken software. They have responsibility too.



> Too frequently they are put under huge pressure and cannot say "no" to their managers.

Actually, this is precisely why something like the "Iron Ring", linked above, exists. Standards bodies like APEGA (https://www.apega.ca) exist precisely so that you have someone backing you if you say "no" to a manager. If you're an engineer and you put the public at harm in any way, APEGA will come after you. Part of being able to call yourself an engineer in Canada is the distinction that specifically you won't just say "yes" and bend over because your manager wills it. Your duty is first and foremost to the public.

This is something that works incredibly well for engineers, geologists, and geoscientists, and I can't see why it wouldn't work for software. Developers who are contracted to build systems for the public like this absolutely _should_ have liability here.


Software projects that have a significant impact on public safety or well-being are relatively rare. You can easily apply additional policies to them - and indeed this is what is being done (see airplane software, medical software, etc.).

But will you really call your union because your manager forces you to skip writing unit tests for each and every class you write? Bearing in mind that you are just churning out some crappy e-commerce website?

A separate issue is that even if you are careful with your own code, the third-party components you are using may be more lax. This is well recognized, e.g., in medical software, where any third-party code must be identified and held to the same standards as the software you write.

It is very difficult to say who is to blame for creating a certain software bug. Is it the person who configured the software, because he chose invalid config parameters? Is it the intern who wrote an invalid piece of parsing code? Is it the system architect who put the wrong data into the specification?

For any given piece of software, many people are responsible in various ways for its creation. In such contexts you cannot blame individuals, especially since their intentions might be good. Even very careful programmers will introduce bugs.


> just churning out some crappy e-commerce website?

That can affect people's lives dramatically though. An off-by-one error charges someone's debit card $2,000 instead of $200 and now they can't buy food. Ecomm is important.
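For what it's worth, a slipped digit is all it takes. A hypothetical sketch (all names made up, not from any real billing system) of how one extra zero in a dollars-to-cents conversion turns a $200 charge into $2,000:

```python
def charge_amount_cents(price_dollars: float) -> int:
    # Correct: convert dollars to integer cents once, explicitly.
    return round(price_dollars * 100)

def buggy_charge_amount(price_dollars: float) -> int:
    # Bug: a stray digit in the conversion constant multiplies
    # every charge by ten.
    return round(price_dollars * 1000)

print(charge_amount_cents(200.00))   # 20000 cents  == $200.00
print(buggy_charge_amount(200.00))   # 200000 cents == $2,000.00
```

One character in a constant, and the customer's account is drained by a factor of ten.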


This comes back to how software is designed and built. On its own, a single bad weld should never endanger a bridge, but we accept that a single mistake can bring down software. At its core, that's why software fails so often AND that's what we need to fix.


On all the bridges we build every weld is non-destructively tested before the item is painted and shipped out.

Magnetic particle and penetrant dye testing.

I don't understand the term 'Software Engineer'. It still feels like the Wild West to me.


A single bad weld is isolated; a single bug can propagate a lot of bad data. And software crashing isn't the same as a bridge collapsing - in many cases, crashing is the best thing that can happen.
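To put that in code: a minimal, hypothetical sketch (function and bounds are made up for illustration) of the fail-fast idea, where a parser crashes loudly on implausible input instead of letting a nonsense value flow downstream into someone's records:

```python
def parse_fine_amount(raw: str) -> int:
    """Parse a fine amount in whole dollars; halt loudly on anything suspicious."""
    value = int(raw)  # raises ValueError on garbage instead of guessing
    if not 0 <= value <= 10_000:
        # Crashing here is better than silently recording a nonsense fine
        # and propagating it into billing or legal records.
        raise ValueError(f"implausible fine amount: {value}")
    return value

print(parse_fine_amount("250"))  # 250
```

The crash is visible and local; the silent bad value is neither.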


We could do the same with computation. On truly mission-critical software you see multiple independent implementations of critical functionality.
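That idea is usually called N-version programming. A toy sketch (names and the rounding task are invented for illustration): run several independently written versions of the same function and take a majority vote, refusing to proceed when no majority exists:

```python
import math
from collections import Counter
from decimal import Decimal, ROUND_HALF_UP

# Three "independently written" versions of round-half-up,
# one of them deliberately faulty.
def round_half_up(x):
    return math.floor(x + 0.5)

def round_via_decimal(x):
    return int(Decimal(str(x)).quantize(Decimal("1"), rounding=ROUND_HALF_UP))

def round_buggy(x):
    return int(x)  # faulty version: truncates instead of rounding

def vote(versions, x):
    """Run every version and return the majority answer, or fail loudly."""
    results = [v(x) for v in versions]
    winner, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError(f"no majority among versions: {results}")
    return winner

print(vote([round_half_up, round_via_decimal, round_buggy], 2.7))  # 3: the faulty version is outvoted
```

Like redundant welds inspected independently, a single bad implementation gets outvoted rather than silently trusted - at the cost of building the thing three times.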


> But will you really call your union because your manager forces you to skip writing unit tests for each and every class you write? Having in mind you are just churning out some crappy e-commerce website?

Firstly engineering bodies are not a union. They are a legal organization that takes action to protect the public from actual harm. In the case of the software put forth in TFA, quantifiable harm was caused to people because of poorly tested and poorly integrated software.

Secondly, not every piece of code requires this level of oversight. As an example, you don't need a civil engineer to sign off before you fix a picture frame on your wall; however, if you decide to remove a load-bearing wall, you had better have someone sign off that you're not collapsing your house, or selling it to someone else who won't know about the problem.

In the same way, the biggest restriction this creates for most people is that they won't be able to call themselves a software "engineer" and won't have the authority to sign off a project (in the same way you sign off that work on a bridge has been completed). This already exists in Canada and doesn't prevent anyone from creating a crappy e-commerce site. Protecting the word "engineer" is good for multiple reasons, and holding software to the same standards as we hold the rest of our public projects is not only a good idea in theory, but probably saves cost in the long run too. Think of all the people who will sue over being wrongfully arrested or put on the sex offenders registry.

> For any given piece of software there are many people responsible in various ways for its creation.

As with any large project. We don't blame the individual electrician and make them legally culpable if the light sockets start a house fire. However, if the sockets are faulty and the lead engineer _knew_ as much (or did not do enough due diligence to be able to rule it out), yet still signed off to let the construction finish and endangered someone who purchased the home, then the engineer is liable (to an extent for damages, but more importantly in that they will lose the right to work as an engineer in the future). The most salient point is that you create a code of responsibilities for engineers towards the public, and you build the legal framework so that projects that harm or endanger the public can be reined in.

For projects like TFA, I can imagine that under an organization like APEGA, you could treat software similar to traditional projects by adding integration requirements to the contract, and requiring a certain period (90 - 120 days maybe?) of time after all the data has been transferred over to soft-test the system in production to see if it meets the on-site requirements (i.e. matches what is done manually). Does this make the contract more expensive? Yes. By having processes like these, can we be reasonably sure that we aren't going to cause a large amount of external harm or grief? Maybe. Building a culture where we don't just ship a proof-of-concept that can potentially imprison people because our manager had short term goals starts by making it clear who is liable, who is accountable, and where your responsibility lies.


>Don't put the blame on the developers. Too frequently they are put under huge pressure and cannot say "no" to their managers.

Befehl ist Befehl ("an order is an order"). Blaming management is literally the Nuremberg defence. If we feel that we have to do reckless and dangerous things in order to keep our jobs, then we desperately need to unionise. It's just not good enough for us to throw up our hands and blame the PHBs.

Management need to change, but they're not going to do it voluntarily. Pressure needs to come from users, but also from developers. We need to fight back against poor practices, as millions of workers have done in other industries. We're the guys on the shop floor, we're the people who know what the issues are and how to fix them. If we don't take a stand, nobody else will.

How would we react if this story was about a plane crash or a nuclear accident caused by negligence? Would we be so quick to absolve employees of responsibility, or would we be asking why nobody blew the whistle?


> Befehl ist Befehl. Blaming management is literally the Nuremberg defence.

This isn't really about shifting blame or absolving responsibility. It is everyone's responsibility to ensure that safety and the public good are put first. If developers don't have the ability to say no to their management because there's no institution backing up that decision then we are to blame.

I know it's a cynical world view, but management will trade just about anything for profits, and employees will compromise their principles for their livelihood. Hoping that this system of perverse incentives will correct itself internally, using only the individual employee's right to resign, is far too hopeful for my taste.


> Pressure needs to come from users, but also from developers

It needs to come from government especially. It's a basic function of government to protect the public.

> How would we react if this story was about a plane crash or a nuclear accident caused by negligence?

If there were no government laws penalizing this outcome and regulations preventing it, I'd be horrified.


There are plenty of regulations covering software used in aircraft and nuclear power control. Move on to the next moral panic.


Gee Bob, that did a helluva lot of good in preventing these folks from enduring undue legal hassles.


> Nuremberg defence

Except we're not just talking about clearly malicious or fraudulent software with sins of commission. The bigger issue is cutting corners, taking shortcuts, and individual weaknesses that break the whole system.


Successive and cumulative corner-cutting. Management should provide oversight covering several steps, but often they close their eyes, and the last developer who touched the code gets the blame.


I would amend that to "don't put the blame solely on the developers". I've seen way too many developers whose mentality is "ship it, it seems good enough, we'll fix any bugs as we go".


If the company culture supports this attitude, then sure - you will have developers saying that. But it isn't the developers who create that culture. That is something the management is responsible for.


The fact that a person acted pursuant to an order of a superior does not relieve him from responsibility. This can be a difficult decision to make in these circumstances, but I just couldn't ship buggy code that may be responsible for someone being unfairly jailed.

That's why I try to always document as much as possible in the workplace.


People aren't computers, they don't just run the company culture program in their brains. Sure, developers are influenced by company culture, but they contribute to it as well.

At one end of the spectrum, you have junior devs, fresh out of school, highly susceptible to the effects of company culture. At the other, you have senior devs with more than a decade of experience, who do a great deal to set the tone and shape the culture.

Think about what kind of cultural influence you get from a senior dev who has accumulated more than ten years of experience of moving fast and breaking things. That's a lesson I found very hard to learn: just because someone has lots of experience, skill, and knowledge doesn't mean they are a better developer.

Mind you, I'm not saying the blame for the state of our industry lies solely with devs. I agree with you that it's the management that has the last word, but that doesn't absolve the rest of us of all responsibility.


If you really care about it you can simply do "cultural screening": don't hire people who clearly don't match your attitude requirements. We are doing that.

You can also discourage certain behaviors. If needed, fire offenders.


Exactly. This is why bosses make the big bucks. Developers are just code-machines and they will perform poorly or well entirely dependent on their external environment and culture created by their management.

(Oh, it's a one-way thing, you say? Bosses are to blame for failures, but devs get credit for successes? Heads I win, tails you lose?)


Seems perfectly reasonable to me.


Even bringing developers (or anyone really) into it is like blaming the prisoners for having a dilemma. It's the nature of software that it does not fit inside one person's head, and this leads to problems.



