
Time for a Code of Ethics for the Technology Industry

Many professions are governed by a code of conduct or a code of ethics. The medical profession is a great example, with its guiding principle, “First, do no harm.” Medical ethics shape the regulation and policy that govern how innovation is implemented. Ethics boards exist to ensure patient safety and to prevent repeats of historical moral failures, such as medical experimentation without patient consent. They also curb advancement that ignores outcomes. For example, just because we can clone a human being doesn’t mean we should. It’s not the ability that stops cloning. It’s the ethics.

The technology industry has no such code of ethics, but it’s time it did. Last month’s Facebook and Cambridge Analytica scandal highlighted how a lack of ethics can cause serious damage to societies and even threaten democracies. And it doesn’t end there.

As technology creates self-driving cars, what code of ethics sits behind the car’s decision-making? In the classic thought experiment known as the Trolley Problem, one must decide whether it is better to kill one person or five. These kinds of ethical dilemmas must be programmed into autonomous cars. And even if humans could agree on an answer to the Trolley Problem, that answer may vary from culture to culture. Variants of the dilemma weigh a human life against an animal’s, and in many countries dogs are not as beloved as they are in most Western countries. Cultural norms shape moral decision-making.

As Artificial Intelligence (AI) and Machine Learning begin to drive many of our online experiences, ethics becomes even more important. Machine Learning systems evolve, using algorithms that pick the “best” outcome. But human beings program those algorithms and choose the data they learn from, and human beings carry bias. For example, if AI is used in employment decisions, would it automatically screen out women judged likely to have a child in the coming years? Would it eliminate people with disabilities? If we do not think through these scenarios in advance, we leave the door open to unethical outcomes, as the sketch below illustrates.
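To make this concrete, here is a minimal, hypothetical sketch in Python of how a screening model trained on historically biased hiring decisions reproduces that bias. The feature names, records and decision rule are invented for illustration only; they do not describe any real hiring system.

```python
# Hypothetical illustration: a naive screening model "learns" from past
# hiring decisions that penalised candidates who took parental leave,
# and then reproduces exactly that bias. All data below is invented.

from collections import defaultdict

# Past records: (years_experience, took_parental_leave, was_hired)
history = [
    (5, False, True),
    (4, False, True),
    (6, True,  False),  # equally qualified, rejected after taking leave
    (3, False, True),
    (7, True,  False),
    (2, False, False),
]

# Learn the historical hire rate for each value of the leave feature.
hire_stats = defaultdict(lambda: [0, 0])  # value -> [hired, total]
for _, took_leave, hired in history:
    hire_stats[took_leave][0] += int(hired)
    hire_stats[took_leave][1] += 1

def screen(years_experience: int, took_parental_leave: bool) -> bool:
    """Recommend an interview if candidates 'like this one' were
    historically hired more often than not. Note that experience is
    ignored entirely -- the proxy feature dominates the decision."""
    hired, total = hire_stats[took_parental_leave]
    return hired / total > 0.5

# Two equally experienced candidates; only the leave history differs.
print(screen(6, took_parental_leave=False))  # True  -> invited
print(screen(6, took_parental_leave=True))   # False -> filtered out
```

Because the “best” outcome is defined purely by past decisions, the model never questions whether those decisions were fair. A code of ethics would demand that such proxies be audited before a system like this is ever deployed.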

New facial recognition systems also give rise to ethical problems. Tracking human beings by their faces means companies can follow everywhere they go and everyone they meet, and can infer patterns of association. For example, you may not be a criminal, but what if you happened to visit the same coffee shop as a criminal on more than one occasion? Would you be falsely profiled as a criminal?

The problem with the technology industry today is that, despite the many intelligent minds driving the world’s greatest innovations, there is no comparable effort to consider the consequences of those innovations or to mitigate their risks. The Facebook scandal may be the first global one we’ve seen born of this lack of ethics, but such incidents will only get worse if left unchecked.

Many have pointed to government regulation as a solution, but most legislators do not have the technical knowledge to fully comprehend how many advanced technologies actually work, much less the threats they could pose in the future. During the US Congressional hearings with Mark Zuckerberg, the legislators’ lack of understanding of basic technological functions and their risks became painfully apparent. Self-regulation through a code of ethics therefore seems the best option for the immediate future. Regulation along the lines of Europe’s GDPR may eventually be needed, but regulation is often years behind the technology it is meant to govern.

We are calling for a technology industry board, made up of top computer science professionals from various countries and disciplines, to join with organizations such as the Electronic Frontier Foundation, human rights experts and data protection advocates to develop a technology code of ethics. This group would also need to determine how to enforce that code. Perhaps it could collaborate with the UN or ICSU to develop governance and enforcement options. The point is, no single entity has all the answers. We must come together to fill the ethics vacuum in technology, or risk misguided and ineffective regulation by those who do not possess the in-depth expertise that our own computer scientists do.

The technology industry has done many great things for humanity. We’ve also done some bad things. The time has come for us to step up and take responsibility for what we have created and will create. We need to rethink economic models that strip people of their personal information and sell it without a thought for how that could come back to haunt us. A code of ethics to guide how we develop and use technology is long overdue.
