Ethics in software development

This is a written-down version of the talk I gave at TIAD.io on October 4th, 2016.
Slides can be found here.

If you’re watching the TV series “Silicon Valley”, you might remember the signature gag from season one, shot at a real TechCrunch Disrupt event. Each of the presenting teams claimed they were making the world a better place, be it through Paxos algorithms for consensus protocols, or through canonical data models to communicate between endpoints. That episode stuck in my head as a sort of manifesto on how detached from reality we sometimes get in our tech world. I’ve always been curious whether we actually believe that we’re making the world a better place. That’s what inspired me to look into the ethical aspects of our work.

I wanted to start by showing you two documents. As I was reading through literature and articles on the subject of ethics in software development, I started to notice a pattern. A code of ethics is something a variety of professions apply – including, for example, engineers in the United States. But for a long time I couldn’t find any code of ethics for software engineers or software developers. Doctors – yes; civil engineers – yes; the public sector, generally – yes; business ethics – yes; codes of conduct for employees, generally – yes; but software engineers – no.

I wanted to show you the Preamble to one such engineering code of ethics. This one was written by the National Society of Professional Engineers, an organization based in the United States. The Preamble goes like this (emphasis mine):

Engineering is an important and learned profession. As members of this profession, engineers are expected to exhibit the highest standards of honesty and integrity. Engineering has a direct and vital impact on the quality of life for all people. Accordingly, the services provided by engineers require honesty, impartiality, fairness, and equity, and must be dedicated to the protection of the public health, safety, and welfare. Engineers must perform under a standard of professional behavior that requires adherence to the highest principles of ethical conduct.

I find this Preamble to be refreshingly idealistic – in a good way. This Code of Ethics was written by – and for – civil engineers; people who – as we read on later in the document – hold paramount the safety, health and welfare of the public. We cannot argue against such a statement. Engineers build bridges, design cars and medical equipment. They often work for the public sector. The way they carry out their duties has (in)direct impact on the safety and health of everyone in this auditorium and obviously far beyond it.

Are we – as software engineers – very different? Well… I work in a company that produces, putting it in very simplistic terms, text-editing software. Our clients include Facebook, PayPal, Lenovo, and so on, and our software helps them – once again, simplifying – to create a high-quality marketing message or documentation that’s easy to understand. But in this auditorium there are people working with the Internet of Things; people working for medical companies; authors of software that runs in connected cars; people who write, test, and administer web applications that collect sensitive data on their users. The way we work has an impact on the safety, health, and welfare of the public. The fact that most of us probably work in the private sector doesn’t change that.

After some time, I stumbled upon an engineering code of ethics specifically for software engineers. Created years ago by two North American organizations – the Institute of Electrical and Electronics Engineers and the Association for Computing Machinery – it reads quite like the code of ethics for civil engineers. Their Software Engineering Code of Ethics and Professional Practice documents the ethical and professional obligations of software engineers. It talks about different areas of a software engineer’s work and tries to outline best practices, touching on areas such as public interest, the relationship with clients and employers, working with colleagues, self-development, and so on.

I highly encourage you all to read both of those Codes and think about how your work fits in. When we think of moving fast and breaking things and our dynamic world of startups, VC funding, and people throwing money at us for nothing, it’s hard not to smirk when reading fragments of the ethical code such as:

1.03. Approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy, or harm the environment. The ultimate effect of the work should be to the public good.

or

3.15. Treat all forms of software maintenance with the same professionalism as new development.

Ethics: a 90-second crash course

What is ethics anyway? What does it mean to act in an ethical way? Whole books have been written on this subject and the evolution of ethical thought is centuries old. Going into detail is beyond the scope of what we’re doing here, but I think it’s a good general rule to define terms we are going to discuss.

Businessdictionary.com defines “ethical behavior” as:

Acting in ways consistent with what society and individuals typically think are good values. Ethical behavior tends to be good for business and involves demonstrating respect for key moral principles that include honesty, fairness, equality, dignity, diversity and individual rights.

While dictionary.com states that when something is “ethical”, it means that it is:

in accordance with the rules or standards for right conduct or practice, especially the standards of a profession

Now that we’ve got that defined…

Issues I will not talk about

Since every human action can be subject to an ethical evaluation and we only have 45 minutes, I will pick and choose issues per my subjective taste. I wanted to mention a few things I will not talk about, but which are still important to at least think about in the context of your company.

In their paper called “Ethical Issues in Software Development”, Ron Garrett and Jennifer Lewis mention a series of unethical behaviors, for example:

Using open-source code in their [company] code without properly crediting the source

Using illegal software to perform their [company] tasks

Reverse engineering code to find out how a process works (…)

Taking talent from the competition

The list of things one can ethically assess in a programmer’s workplace is longer than a six-year-old’s Christmas wishlist. I encourage you all to question and explore: should I credit the author of a code snippet I found on StackOverflow? Is it OK for my managers to aggressively recruit from our competition, and how will that influence the way my team members interact with each other? What should I do when there is a licensing conflict between libraries I want to use?

What we build

A few years ago a team of developers published a mobile app that was supposed to help you meet new people (“girls”). You might remember the app, called Girls Around Me. You might also not, since it was neither the first nor the last app to make you lose a little bit of faith in humanity. The app had a pretty high creepiness factor, as it allowed you to spy on people (“girls”) in your vicinity without them knowing. It used Foursquare’s API to look for people (“girls”) who had checked in around your area, then fetched their Facebook data – including photos, real names, and profile URLs – and then let you write them a Facebook message.
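To make the point concrete, here is a deliberately toy sketch of that data flow. Everything in it is mocked and all helper names are hypothetical – the real app talked to the Foursquare and Facebook APIs – but it shows how little glue code it takes to turn two legitimately public data sources into a stalking tool:

```python
# Hypothetical sketch -- mock data and invented helper names, no real API calls.

def nearby_checkins(lat, lon, radius_km, checkins):
    """Return check-ins roughly within radius_km of (lat, lon).

    Uses a crude flat-earth approximation (1 degree ~ 111 km), which is
    plenty for illustrating the idea.
    """
    deg = radius_km / 111.0
    return [c for c in checkins
            if abs(c["lat"] - lat) < deg and abs(c["lon"] - lon) < deg]

def enrich_with_profile(checkin, profiles):
    """Join a location check-in with the public social profile of the same user."""
    profile = profiles.get(checkin["user_id"], {})
    return {**checkin,
            "name": profile.get("name"),
            "profile_url": profile.get("url")}

# Mock data standing in for API responses
CHECKINS = [
    {"user_id": "u1", "lat": 48.8566, "lon": 2.3522, "venue": "Cafe"},
    {"user_id": "u2", "lat": 40.7128, "lon": -74.0060, "venue": "Museum"},
]
PROFILES = {
    "u1": {"name": "Alice", "url": "https://example.com/alice"},
    "u2": {"name": "Bob", "url": "https://example.com/bob"},
}

# "Who is near me right now, and who are they?" -- the app in two joins
targets = [enrich_with_profile(c, PROFILES)
           for c in nearby_checkins(48.8566, 2.3522, 1.0, CHECKINS)]
```

Note that each join on its own looks harmless – both services expose this data deliberately. The creepiness emerges entirely from the combination, which is exactly why “we only use public APIs” is not an ethical defense.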

Imagine that you’re sitting in a coffee shop or enjoying a museum exhibition. You checked in on Foursquare because maybe there was some sort of discount involved for people who did. You’re having a nice time by yourself, when suddenly you get a Facebook message from someone you don’t know. You may have no idea what they look like, you don’t know where they are; they just write that they can see you, that they find you attractive, and would you like to get a coffee? How would you feel? Would you be thrilled to meet them? Remember: they are using an app you have not signed up for, have not consented to, have not installed, and would probably resent if you learned about it.

According to a quote I found in a New York Times article, Foursquare wasn’t thrilled by what their spokesperson called a violation of our API policy, and they shut off the app’s API access. The app was removed from Apple’s iTunes App Store and is not available for Android. But other dating apps that rely on Facebook, like Tinder, might also endanger their (for cultural reasons: mostly female) users, as some people will want to bypass the rule of only being able to message people who liked them back by stalking them on Facebook or Instagram. You might argue that this is what these women signed up for, but why should we accept that dating is supposed to be risky – and, by the way, risky only for women? Is it fair that some of your users are punished for the misbehavior of other users?

When you build apps whose purpose is connecting people, think long and hard about what to do to make your users feel safe. Consider all the data you’d like to acquire and whether it’s really necessary to collect that much. I’m pretty sure no one wants their product to cause pain and sorrow. Helping users feel secure – not only by letting them control what data they share, but also by making sure you have clear and effective anti-harassment and anti-stalking policies in place – will actually make you stand out in a positive way. The safety of your users is an ethical issue.

Whatever you build, keep in mind the words of Shannon Vallor, associate professor of philosophy at Santa Clara University:

Users will always do things with technology that we didn’t anticipate

It’s impossible to anticipate everything, but as creators we are responsible for what we build and we should keep an open ear and an open mind. We’re software engineers, and as the software engineering code of ethics says, we should act consistently with the public interest. Technology moves fast, but not all of this movement is for the better and just as we shouldn’t be forced to create unethical products, users shouldn’t be forced to adapt and compromise.

How we build things

We build very different things. The richness of the software, web apps, and mobile apps we create is mind-bending: technology gives us so much. We work in different settings: small startups, huge technology companies, medium-sized firms. Our teams differ too: on-site, remote, occasional home office, full-time, part-time, freelance. Sometimes, if we work in the sharing economy, our software and services will provide work for others.

We (the technical workforce) are quite the privileged bunch: we get paid a lot, get perks, swag at conferences, travel budgets, ping-pong tables, free lunches, free alcohol in the office, flex time, you name it. We’re also still quite young. For a lot of us, joining a startup will be one of our first job experiences.

Some of us will end up in companies that use this lack of experience to foster a culture of long hours and never leaving the office. If you can eat, work, enjoy yourself, socialize, and sometimes even sleep in the office, who needs a life outside of it? It all looks great for (some) young, single people, but it is not a work environment that older people, or people with families and children, will thrive in. By fostering it, we are actively predetermining that our teams will consist of people just like us: young, childless party people. Those are traits that have nothing to do with how good we are as engineers, and this process leads to exclusion.

I have the privilege of knowing IT professionals from quite a few countries, and (at first) I was surprised to learn that some of them are in workers’ unions: IT professionals from the UK, Sweden, Norway, and Germany. Some of you will laugh: we are so privileged, why would we need workers’ unions? Well, they are helpful in making sure you do get a fair wage and that you do work in a healthy environment. They can support less experienced employees, as well as people from groups underrepresented in tech who are at risk of getting worse job deals because of bias; and they work with companies to make sure the needs of employees are taken into account. Whatever your opinion on workers’ unions is, keep in mind that this is most often not a war between bosses and workers, but a form of cooperation.

Think about it: What’s the situation in your company? Do you have a workers’ union? Do you have transparent rules on salaries? Is it acceptable to talk about your salary with your coworkers, or does your work contract ban disclosing such information? Is your company culture friendly towards people with children, or older people? How do you treat people underrepresented in tech? Are they employed in your company at all? Are you OK with how things are looking? What would you like to change? Are you truly happy with your company culture?

The sharing economy

In general, IT professionals can afford to be freelancers: we earn enough to pay all our taxes and insurance ourselves. It’s easy to forget that not everyone would prefer to be a freelancer, and that some people work as freelancers only because they have no other choice. The problem of diminishing workers’ rights is a major issue in the sharing economy. It is something Uber, Deliveroo, and the like did not create, but are nonetheless propagating. All of these apps push the message that they are helping people earn an extra buck and that they offer freedom and independence. But talking with people who work for those companies paints a slightly different picture.

There’s been a series of lawsuits in different countries (Germany, the UK, the US, etc.) against companies like Uber, Homejoy, Caviar, Postmates, eCourier, CitySprint, and so on, filed by people freelancing for them. Drivers, delivery people, and cleaners want to be recognized as regular employees. They want the right to sick leave, to holidays, to insurance – things you and I often take for granted. The apps they use boast of freedom and independence, but at the same time they often have rules that are in open conflict with independence. For example, Homejoy (a sharing-economy cleaning app) would not let you specify commute time or any other aspect of your job. Last I checked, that’s not what independent freelancing looks like. Handy, another app providing cleaning services, would have all its cleaners be self-employed, yet still asked them to wear branded clothing. Uber calls drivers partners, but makes demands as if those partners were employees. You can’t have your cake and eat it too.

People working in the sharing economy have the right to feel safe at work. Taxi companies have security procedures in place and people on call to help with emergencies, while Uber is known for ignoring complaints, and often the only thing you can do is send an e-mail afterwards. Cleaning professionals who get work through sharing-economy apps don’t really get support or know what to do when they get assaulted on the job, and – depending on the app – can’t even specify that there is a customer they do not want to work for anymore.

Saying that all these people could just quit and find something they like doing more lacks perspective: if they could find better jobs, they would. If you’re an immigrant (and a significant number of self-employed sharing-economy freelancers are), you face prejudice, bias, and legal obstacles when you want to work; it’s even worse when you don’t speak the language well.

Sharing-economy apps have one big ethical flaw: they provide services that are cheaper and more accessible, but the people who pay for the difference are the people who provide the services. It’s their lack of job security, of regular contracts, of insurance, and of stability that makes the offered services so cheap. Homejoy advertised itself as a movement to make cleaning services available to a broad audience, rather than a luxury for the rich – a pretty problematic pitch that calls a living wage a luxury.

We might build apps with a specific target in mind (here: people who just want an extra buck), but we need to be aware of who really uses them. We might be technology companies, but we are still responsible for our end users. We should ask ourselves: is benefiting from other people’s hardships ethical? Would we want our friends to earn a living using sharing economy apps? Is treating de facto employees as though they weren’t, and not providing them the benefits they deserve, acting consistently with the public interest?

The safety and well-being of our end users

In 2012, someone stole the password of a Dropbox employee. This gave them access to the employee’s account, and subsequently to a database containing the e-mail addresses of Dropbox users. So Dropbox was cracked, and the e-mails and encrypted passwords of over 68 million Dropbox users got into the wrong hands. Years later, the data was discovered up for sale online during routine security work. Dropbox informed everyone and asked its users to change their passwords. Hacks happen. They’re painful, they anger people, and they generate losses, but they do happen, and Dropbox introduced additional security measures, such as two-factor authentication. At the same time, Dropbox seems to coerce Mac OS users into giving its app elevated privileges, so it still has a long way to go on the ethical front.

There is a new set of data protection and cybersecurity laws passed in the EU only a few months ago. It changes the way hacks of this sort are to be handled. The set of laws, which comes into force in 2018, obliges companies working in the “essential services” sectors to contact the authorities within 72 hours of an attack, or face fines. Customers are to be informed as soon as possible. These services include healthcare, banking, transport, energy, and more.

It is a good step towards transparency, but there are still no obligations towards the end users. Hacks are walking ethical dilemmas. To quote security expert Tom Cheesewright:

Do you risk if you announce early that you terrify people and actually the breach has been minimal, or do you do the forensics first, dig down through the systems, work out what has gone and then announce things once you’re more sure?

How early is early enough? How late is too late? It is a conflict between the needs of multiple forces. Your users will always be the most vulnerable ones. Yes, they signed your Terms of Service, accepted all the rules, and – at least in theory – knew what they were signing up for. But in practice, their interest is somewhat pushed to the back. As the saying goes, if you are not paying for the product, you are the product. It’s tempting to say, my users ticked the box, they can’t complain – but could we have a little bit more empathy?

People don’t understand software. They don’t read Terms of Service. They often don’t understand what data is being collected. They don’t understand the risks. Because of that, they cannot make fully informed decisions. It’s not about laziness or about being stupid – it’s about how fast technology moves, how rich it is, how hard it is to catch up. I don’t know about you, but I have a hard time following all the new tools and technologies related to my field, and I consider myself tech-savvy.

Part of the fault lies in how our terms of service and privacy policies are written. They’re all very long and full of legalese. If we want our users to read our rules and understand them, if we want them to use our software in an informed way, we might want to work on our language. Have a look at how, for example, Clue – a period-tracking app – deals with that. Their Privacy Policy explains what data is being collected, how, and what it is used for, in simple English, making it easy to understand and thus to make an informed choice about whether to use the app.

Making sure our users can make informed decisions about the use of our software is part of the software engineering code of ethics:

3.12. Work to develop software and related documents that respect the privacy of those who will be affected by that software

Conclusion

I showed you bits and pieces, seemingly jumping between different themes. That in itself shows how many aspects of our work influence public welfare. A conclusion should bring all of this together.

Technology doesn’t exist in a vacuum. The more advanced technology gets, the more influence it has on our lives – in ways we haven’t anticipated. Asking questions about the nature of our work is inevitable, as are the ethical codes that try to provide answers – and that’s a good thing.

You don’t have to agree with everything I said – I don’t expect you to, and that’s OK. Regardless, if you take just one thing away from this talk, let it be this: start asking questions. We have different moral and ethical standards – what is important is that we start applying them to our work.

And remember that engineers are people who hold paramount the safety, health and welfare of the public. That includes us.

Huge thanks to Kim Tore Jensen for providing
valuable feedback when I was preparing this talk.