VW
blog.cleancoder.com

I suppose you could make the argument that these programmers did not
know what they were doing. That they were simply given some specs, and
they implemented those specs, and didn't know that they were
accomplices in a case of massive fraud.
I think that argument is even more asinine than Michael Horn's. They
knew. And if they didn't know, they should have known. They had a
responsibility to know.
I agree with all the points in the article except for the point that the programmers should have known. For me it is a plausible scenario that the programmers were told that this feature is needed for some good reason (probably testing).
When I was a young engineer I had a mentor. He was a war baby and a strict pacifist. He was also very good and his advice was much sought after so he could afford to refuse all offers from the defense industry.
He once told me that for his whole life he managed to never design anything that could be used to harm people - except for one thing. When he was young he was hired to design a gear rim for a crane. He told me he was given the load specifications but never saw a drawing of the actual crane. That was a bit unusual but nothing he worried about.
It turned out that the gear rim was actually for a Howitzer. He never worked for that client again.
There are all kinds of reasons why a car has to behave differently while on a dynamometer and there are all kinds of special code branches that are executed only during test. For the programmers it probably was just another special case among many.
Don't be evil and don't be a fool, but you can't be expected to do a full ethics check for every feature you are supposed to implement.
EDIT: Spelling, style and removal of some superfluous chatter.
From what I've read, the trigger for the mode switch was very detailed and narrowly tailored to the EPA certification testing, and included barometric pressure as a factor.
That makes it quite a bit harder to believe that whoever implemented it thought it was for some legitimate testing. For testing you want a trigger that is hard for anyone to hit accidentally, but easy for people who know about it to hit. You would not include barometric pressure, because that narrows the ability to get into the test mode way too much.
An ideal sequence would be some nonsensical sequence of inputs, like a specific sequence of left and right steering inputs, with a specific sequence of turn signals (often opposite of the direction turned) if the ECU has turn signal data available, interleaved with a specific pattern of taps on the brakes.
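For concreteness, here is a minimal sketch in C of that kind of deliberate trigger: trivial to enter on purpose, practically impossible to hit by accident, and with no dependence on environmental readings like barometric pressure. Every event name and the sequence itself are invented for illustration; this is not anything from an actual ECU.

    /* Hypothetical deliberate test-mode trigger: a nonsensical driver-input
     * sequence that must be entered exactly. All names are made up. */
    #include <stdbool.h>
    #include <stddef.h>

    typedef enum {
        EV_STEER_LEFT,
        EV_STEER_RIGHT,
        EV_SIGNAL_LEFT,
        EV_SIGNAL_RIGHT,
        EV_BRAKE_TAP
    } input_event_t;

    /* A sequence no one would produce by accident: steer left while
     * signalling right, steer right while signalling left, two brake taps. */
    static const input_event_t trigger_seq[] = {
        EV_STEER_LEFT, EV_SIGNAL_RIGHT,
        EV_STEER_RIGHT, EV_SIGNAL_LEFT,
        EV_BRAKE_TAP, EV_BRAKE_TAP
    };

    static size_t match_pos = 0;

    /* Feed each driver input event; returns true once the full sequence
     * has been entered and the test mode should be armed. */
    bool test_mode_trigger(input_event_t ev)
    {
        if (ev == trigger_seq[match_pos]) {
            match_pos++;
            if (match_pos == sizeof trigger_seq / sizeof trigger_seq[0]) {
                match_pos = 0;
                return true;   /* deliberate trigger completed */
            }
        } else {
            /* any wrong input resets the sequence */
            match_pos = (ev == trigger_seq[0]) ? 1 : 0;
        }
        return false;
    }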
I don't doubt that the developers were told to specifically write code to detect the EPA certification testing environment. I just think it is a possible scenario that they were left in the dark about the real reason for the detection to exist in the first place.
Hypothetical example: The Lane Keeping Assistant can actively adjust steering. Turning the wheels during a test on the dynamometer can make the car jump off the rolls and harm people. The dynamometer is a highly artificial environment that can potentially confuse the Lane Keeping Assistant.
Do you ensure safety through testing guidelines or through safety measures in code? Would this be a plausible reason for a developer to write the dynamometer testing environment detection code?
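Purely as a hypothetical sketch of what such a safety-motivated detection could look like (all signal names and thresholds invented, nothing to do with VW's actual code):

    /* Legitimate-looking use of dynamometer detection: inhibit active
     * steering assistance so the car cannot steer itself off the rolls. */
    #include <stdbool.h>
    #include <math.h>

    typedef struct {
        float wheel_speed_front;  /* km/h, driven axle */
        float wheel_speed_rear;   /* km/h, non-driven axle */
        float lateral_accel;      /* m/s^2 */
    } vehicle_state_t;

    /* On a two-wheel dynamometer the driven axle spins while the other
     * axle and the lateral sensors report a stationary vehicle. */
    static bool probably_on_dyno(const vehicle_state_t *s)
    {
        return s->wheel_speed_front > 20.0f &&
               s->wheel_speed_rear  <  1.0f &&
               fabsf(s->lateral_accel) < 0.2f;
    }

    /* Lane keeping: never apply steering torque when the car appears
     * to be on a test stand. */
    float lane_keep_torque(const vehicle_state_t *s, float requested_torque)
    {
        if (probably_on_dyno(s))
            return 0.0f;       /* safety: no active steering on the rolls */
        return requested_torque;
    }

A developer who writes that kind of detection for a safety reason would not necessarily suspect it could also be wired into the emissions logic.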
"There are all kinds of reasons why a car has to behave differently while on a dynamometer and there are all kinds of special code branches that are executed only during test. For the programmers it probably was just another special case among many."
I think the point is that someone deliberately did this and they had their hands in the code. Yes, there are variants of the tune-able parameters for various regions and tests. As part of design and validation these can be used interchangeably on the test beds. However, someone, somewhere wrote the emissions defeat device.
Remember, someone wrote the emissions defeat device, AND maintained the code for seven years until now.
It may not have been actual code. Perhaps only values in a look-up table? Perhaps even values in an area of a multi-dimensional array that was never expected to be used.
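Purely as an illustration of that idea (all numbers and names invented), the control code could be nothing more than a map lookup, with the questionable part living entirely in the calibration values rather than in any visible branch:

    /* Calibration-data sketch: EGR rate (%) indexed by engine speed and
     * load bins. The code never branches on "test mode"; only the values
     * in the cells a certification cycle happens to exercise would differ. */
    #define RPM_BINS  4
    #define LOAD_BINS 4

    static const float egr_rate[RPM_BINS][LOAD_BINS] = {
        /* load:  low                   high */
        { 35.0f, 30.0f, 20.0f, 10.0f },   /* idle     */
        { 40.0f, 35.0f, 25.0f, 12.0f },   /* low rpm  */
        { 38.0f, 30.0f, 18.0f,  8.0f },   /* mid rpm  */
        { 30.0f, 22.0f, 12.0f,  5.0f },   /* high rpm */
    };

    static int clampi(int v, int lo, int hi)
    {
        return v < lo ? lo : (v > hi ? hi : v);
    }

    float egr_lookup(int rpm_bin, int load_bin)
    {
        return egr_rate[clampi(rpm_bin, 0, RPM_BINS - 1)]
                       [clampi(load_bin, 0, LOAD_BINS - 1)];
    }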
The tune-able parameters are lookup tables. I know that Ford, for example, will go through up to 15 iterations per day with testers in the field, in various physical environments, leading up to a release.
The code that detected whether the vehicle was on the road or being tested is what is at issue here. It was a software engineer/manager that implemented it.
I find his assertion that the programmers "should have known" to be beyond naive.
There's a reason that teams designing these sorts of systems consult with lawyers who are experts on the relevant law. The programmer's job is to program. Expecting them to also deal with details of legality and morality (beyond grossly obvious things like hard coding dosage limits into medical equipment) is just wishful thinking.
That's the kind of talk that people want to hear: "Oh, the developers were given shitty instructions, they shouldn't have listened." But talk is cheap. Stop to consider the implications of that sort of second-guessing. Obviously things get wacky at both extremes, but when you give someone a spec to meet, you need to have an expectation that it will meet that spec. Our industry is built upon millions of black boxes that meet an I/O spec. Sure, having the developers turn around and say "we changed your spec because it was killing polar bears" opens a much larger can of worms than just implementing what you're told to implement, accepting that it might not be morally agreeable, and getting on to the next thing.
There's a reason people aren't all generic worker bees. It's more efficient to have the lawyers worry about laws, coders worry about code, and managers act as the interface between them (accepting the blame if what the lawyers say isn't properly translated into the programmers' instructions) than to have all three groups worry about all three subjects.
I think law is interesting and has a lot in common with software development, but I don't want to have to go looking up case law as required research before coding a windshield wiper controller.
"The programmer's job is to program. Expecting them to also deal with details of legality and morality (beyond grossly obvious things like hard coding dosage limits into medical equipment) is just wishful thinking."
We're humans, not robots. People can be expected to think about things and participate in society. It's generally held that we should expect pretty much everyone to concern themselves with details of legality and morality as part of being a good citizen... "I'm just a simple automaton doing what I'm told" is generally not a valid excuse.
Do you actually know programmers who literally just take specs and implement them and have no thoughts or opinions about the larger context of what's going on? In my experience, programmers have a lot to say about non-programming aspects of work.
The other issue here is, what constitutes "grossly obvious"? You just drew a totally arbitrary line based on your own opinion of what can be expected and what can't. Your argument is a bit of a strawman; nobody is expecting coders to go read up on case law.
Ultimately, we don't know anything about what happened at VW. We don't know who was responsible, or who knew what, and we're all just crafting up scenarios and speculation ("you see, the specs were such that the engineers couldn't possibly have known what was going on") based on our own experiences and biases.
"Do you actually know programmers who literally just take specs and implement them and have no thoughts or opinions about the larger context of what's going on?"
Sadly, yes. I've found this to be the case with most outsourced developers I've managed. They follow the spec to a T even if there's a glaring issue staring them in the face.
> The programmer's job is to program...
I think an engineer has more responsibility than following orders. A German engineer, especially, should be aware of this. "Ich habe es nicht gewüst" ("I didn't know") is only an excuse as long as it is true.
"Ich habe es nicht gewusst".
Also, I fully agree. There is well-known historic precedent for "I just followed orders" not to be a valid excuse. Plus, the consequences would likely not have been even remotely close to that for a soldier in WW2 (i.e. unlikely to be shot for treason).
OTOH using the programmers as scapegoats is wrong. Yes, those who carry out the orders are guilty. But the entire chain of command that led to it is even more guilty. And thanks to corruption, they'll likely only feel a fraction of the punishment the scapegoats will face.
Writing the software is one act. Being the one to greenlight taking this software and putting it into machines that will be sold to end customers is another.
I can see many reasons why software might be written, or maybe even configured, in a way that could be lethal when deployed to an actual customer, but have completely valid and sane reasons for existing (all manner of testing comes to mind).
Unless it can be proven that the developers had intent and did follow through, there is no particular reason why the blame should fall entirely on them.
Additionally, if he is so intent on having a "profession" that punishes wrongdoers, he should first call for one that protects good members.
It is absolutely plausible that the programmers had no idea.
From the excellent Metafilter thread:
> i mean, how do the product managers rationalize this feature to their colleagues? what do they write in the spec that isn't all-out incriminating?
Modularity
Department 1:
Req 1: Software should enable emissions controls upon receipt of control signal A.
Req 2: Software should disable emissions controls upon receipt of control signal B.
Department 2:
Req 1: If EPA testing device is detected, send signal A.
Req 2: If EPA testing device is not detected, send signal B.
http://www.metafilter.com/153117/EPA-Accuses-VW-of-Emissions...
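Spelled out as code, each department's requirements above could land in a module that looks entirely innocuous on its own; only the combination is damning. A hypothetical sketch, merely translating the mock spec (all names and signals invented):

    #include <stdbool.h>

    /* ---- Department 1's module: emissions control reacts to a signal ---- */
    typedef enum { SIGNAL_A, SIGNAL_B } control_signal_t;

    static bool emissions_controls_enabled = true;

    /* Req 1 / Req 2: enable emissions controls on signal A, disable on B. */
    void on_control_signal(control_signal_t sig)
    {
        emissions_controls_enabled = (sig == SIGNAL_A);
    }

    bool emissions_controls_active(void)
    {
        return emissions_controls_enabled;
    }

    /* ---- Department 2's module: detection decides which signal to send ---- */
    /* Stub standing in for the detection logic, which lives elsewhere. */
    static bool epa_test_environment_detected(void)
    {
        return false;
    }

    /* Req 1 / Req 2: send A when the test environment is detected, else B. */
    void update_control_signal(void)
    {
        on_control_signal(epa_test_environment_detected() ? SIGNAL_A : SIGNAL_B);
    }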
Still seems kind of fishy, at least without some stated reason why it needs to know it's being tested.
See this comment from weinzierl, which has an example of a system that needs to behave differently while in a test environment:
The offending code could be obfuscated through all manner of requirements documents.
Without being able to see the actual code and requirement documents, all claims about them are pure, idle speculation.
Just because it’s possible doesn’t mean that it’s plausible or even likely.
A wise future programmer might want to ensure that the software he writes acts mostly like science (it can be used for good and it can be used for bad, but isn't inherently either) and will thus force decisions about the final product to be made by someone else.
Maybe it's ok in the Volkswagen software to have a knob that controls the amount of NOx in the exhaust, for testing purposes and for adapting the car to various markets. Maybe it's ok for the software to provide heuristics for the driving conditions (highway, city, dynamometer) for some future telemetry application. But the wise future programmer does realize it needs to be someone other than himself who makes the decision to configure the system to couple those two things together, and make the car reduce pollution only when dynamometer mode is active.
Good old shifting of blame works for the bad guys as well as the good guys. It may not be pretty but it works well enough if only you're willing to draw the line of responsibility somewhere for yourself.
> It doesn't matter that their bosses told them to do it. They did it.
It's not that easy. Sure, what they did can be considered "evil"... but what if they had refused to do it? They would most likely have lost their jobs and they would have no chance in a court trial. Volkswagen has an army of lawyers and is tightly connected with every relevant government agency in Germany.
Exactly, Uncle Bob is in the category of people who can walk away if it suits them (he's got fuck you reputation if not fuck you money). So are (some of) the bosses. The developers weren't.
The problem here is Uncle Bob thinks he's in the trenches when in fact he's armchair quarterbacking.
This is all very theoretical, since we don't know the situation.
But the other possibility would be to "blow the whistle" anonymously.
The chances of getting away with that still aren't great (if VW put some effort into flushing out the snitch, I think only a practised liar could get through it...), but it's another way forward.
And actually: it's possible this actually happened, and the official story of how this was discovered is just a cover for an anonymous engineer who managed to get a warning to the right person.
There is some evidence indicating that there's corruption at play, not just within Volkswagen but also in the lobbies and government. Legally, blowing the whistle would likely put them in a situation similar to Snowden's (i.e. unable to take the legally blessed route, having to go directly to the media, forfeiting any protections otherwise granted to whistleblowers).
Even if everything in the article were right, I disagree with the conclusion. I'm glad we have no "profession" to act as another gatekeeper for people to do things. The negative consequences of such a "licensing" system (which would probably transform into a political pact soon enough after introduction) are more profound than the cost of its nonexistence.
The vast majority of people working on the Manhattan Project had no idea they were building a bomb.
What says the programmers knew?
It's conjecture at this point to say anyone who programmed the ECU to do this knew what they were doing.
That said, it is likely they did know, but this comes from above. There are a few psychology experiments showing that many humans will do things they know are wrong or immoral when an authority figure tells them to, even though they don't want to do it. The Milgram Experiment, for instance, comes to this conclusion, among others.
Peer pressure and obedience to authority are real phenomena, and that starts with the leadership that needs to be held accountable. Hearing an authority figure pass the blame to someone at the bottom is disgusting and barking up the wrong tree, I believe.
Is that the experiment that showed that "assholes" are the ones who wouldn't do what they were told if they found it immoral?
While the "nice" and "obedient" always did as they were told? Then we're told never to hire "assholes".
I am not a psychologist, nor did I even take 101 in college, but I don't know if the Milgram Experiment came to a conclusion about perceived personality types. I did have a passing interest in these types of experiments for a few hours and read overviews and conclusions to them and I believe I recall "agreeableness" being something humans strive for to get accepted into a group. The Asch experiment in the 1950's showed people will purposely give the wrong answer if others in the group gave the wrong answer to a basic perception question. [1]
"Assholes", in your context would be people that don't fit the "culture" of the companies "group". Possibly off topic, but it's one of the reasons I always get a bit nervous when companies define their "culture".
When Goldman Sachs blamed their enormous options pricing fiasco on software engineers in 2013, it started the great brain drain of the last two years.
Great brain drain? Meaning software engineers are leaving banks in large numbers?
"The public has been made aware that programmers can be culprits. This will make it more likely that the next time something goes wrong -- a plane crash, a fire, a flood -- that the public will jump to the conclusion that some programmer caused it. Yes, this is a stretch; but it wasn't so long ago that the concept of programmer implication in disasters was non-existent."
"...it wasn't so long ago..." What?
https://en.wikipedia.org/wiki/Therac-25 -- This has been a thing since at least 1985 and probably far longer.
Exactly. There is an existing legal and moral framework for precisely these sorts of situations. It seems to work very well for the emissions scandal too, so why mess with it?
Even if the programmer was fully aware of what they were doing, VW would still be the only party that's legally and morally responsible for this.
(And may God help whoever made the Therac-25 mistake, just imagine making a bug like that)
What is stopping a company from hiring a contractor to code the illegal parts, thereby insulating itself from responsibility? This happens with oil and gas disasters, as discussed on John Oliver: https://www.youtube.com/watch?v=jYusNNldesc
Because many contractors are too smart to do that kind of dirty work. Contractors are liable for harm they cause. I've refused contracting jobs because of liability.
Sometimes I would not mind if developing software required a license, even if I'm arguing against my own interests ATM.
The trouble with licensing a profession like software development is that no-one really knows how to do it very well yet. It's far too young and diverse an industry to have that level of experience and consensus.
Lacking more objective standards, the most likely result of attempting to regulate at this stage seems to be regulators who talk a good talk -- such as the author of this article. Those people will not necessarily be the ones with either the best ideas currently available for building good software or the most useful experience and/or data to advance the state of the art in the future.
I sometimes work on software that really does have to behave properly because significant failures in production really could be very damaging. The idea that some of the careful, successful processes used on some of those projects might be required by regulation/legislation to give way to the kind of junk that a lot of consultants peddle is quite scary.
"The trouble with licensing a profession like software development is that no-one really knows how to do it very well yet. It's far too young and diverse an industry to have that level of experience and consensus."
Only artificially so. Modern software development is mainly about re-inventing wheels from the 1970s with slightly different syntax and more bugs in. If we had settled on a language - doesn't matter what, Ada, ML, C, Lisp, FORTRAN - they're all Turing-complete after all - and gotten on with y'know actually building things, software engineering would be a mature discipline by now. Instead all the accumulated experience gets chucked out the window every time fashion changes.
Luckily we settled on C and we've all seen how bug-free that turned out to be /s
If C was the only language, then a vast body of experience would exist in it and no-one would be writing new code with off-by-one errors in it. What we have right now is everyone "knows" many languages, but actually has little experience with each one. And given the chance to use a new language, we jump at it, even tho' we all know better.
Given the vast body of experience we do have with using C and given that we are in fact still writing new code with off-by-one errors in it, I'm not sure this argument holds much water. There are plenty of developers who have decades of professional experience programming in C, and some of those people are really smart and make software that millions of people depend on, and they still make those mistakes. A bad workman might always blame his tools, but even a good workman will do a better job with better tools.
No kidding. If a team of Doctors, Lawyers or Professional Engineers (Civil, Mechanical, etc.) had been involved in an ethics disaster of similar scale, those people would be in danger of losing the ability to continue practicing at a professional level.
There is almost certainly a PE that is responsible for the emissions systems in these cars.
That doesn't mean they are going to be the only person at the company that is responsible, but they signed off on it and are responsible for it in the context you speak of.
While the ethics behind the implementation might have been obvious, the actual legality of it is not. The law could very well have been written in a way that makes this completely legal.
A software engineer would most likely not know the intricate details of the law required to know whether it was legal (in all nations) or not.
Professions are tools workers use to increase their bargaining power over employers and put up barriers to entry into said profession. See doctors.