Jamie Barnard is Unilever’s general counsel for global marketing and media, and chair of the Data Ethics Board at the World Federation of Advertisers (WFA), where he led the creation of the report, Data Ethics: The Rise of Morality in Technology. He spoke with Google’s Pedro Pina, VP of global client and agency solutions, to discuss how today’s leaders should be thinking about data ethics in our digital society.
Pedro Pina, Google: When it comes to issues surrounding online privacy and data regulation, you told me once that brands should be “courageous” and that this is not the time for “being cautious, compliant, or holding back.” But that’s exactly how most people think about this topic. Compliance with GDPR and other data regulations is key. Tell me more about your mindset. How should people be thinking about this?
Jamie Barnard, Unilever: Pioneering companies are looking beyond compliance, as tough as that sounds. Data compliance is a baseline, protecting people’s fundamental human rights on the one hand, and shielding companies from the sharp end of the law on the other. But it’s of limited value in the court of public opinion. If people think a company’s data practices are unethical, then a demonstration of legal compliance will not protect its reputation. This is why brands have to set and follow a code of ethics.
There are some amazing companies that people trust because they lead with ethics and integrity. And sometimes it’s the small things that count.
Here’s a good example: Some years ago, I downloaded a social media app. When I registered, a pop-up appeared asking for access to my contacts.
Ordinarily, alarm bells would ring, but they made their intentions crystal clear. My contacts would be encrypted and only used to connect me to existing friends using the app. Then my data would be deleted permanently. This was all the reassurance I needed. I’ve never forgotten their respect for online data privacy and their open, transparent approach.
Demonstrating your commitment to ethical principles is courageous and will ultimately build trust.
Pina: Privacy isn’t often viewed as the most galvanizing of topics. It’s important, but it’s not seen as exciting. How do you motivate your colleagues, and the wider corporate world, to see data ethics as a game-changer?
Barnard: The excitement comes from cracking the tension between data-driven innovation and people’s expectations of privacy. The excitement comes from pushing the technical barriers of what can be done without compromising your commitment to people’s safety, privacy, and well-being.
If you want to be progressive and do something no one has done before — and you want to do it without compromising your values — then you’re raising the bar for yourself. When you see privacy as a fundamental right, protecting it becomes profoundly rewarding.
Take artificial intelligence (AI), for example. A company might design an AI system with the best of intentions. In the beginning, it may work perfectly, but after six months, engineers may notice the system is making decisions with unintended consequences. Identifying and solving complex challenges like this is exciting. Finding ways to reengineer morals and ethics back into intelligent systems is what drives a lot of people. The joy comes from the challenge.
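As a purely illustrative sketch of how such a shift might be noticed in practice, the check below compares a deployed decision system’s recent approval rate against the rate observed at launch. Every name and number here is a hypothetical assumption for illustration, not a description of any real system.

```python
# Illustrative sketch only: periodically compare a deployed decision system's
# approval rate against the rate observed at launch, so a quiet shift in its
# behavior is surfaced to engineers. All names and thresholds are hypothetical.

def approval_rate(decisions: list[bool]) -> float:
    """Fraction of decisions that were approvals."""
    return sum(decisions) / len(decisions) if decisions else 0.0

def has_drifted(baseline_rate: float, recent_decisions: list[bool],
                tolerance: float = 0.10) -> bool:
    """True if the recent approval rate has moved more than `tolerance`
    away from the baseline measured when the system launched."""
    return abs(approval_rate(recent_decisions) - baseline_rate) > tolerance

# Example: six months in, approvals have fallen well below the launch rate,
# which is exactly the kind of unintended shift worth a human review.
baseline = 0.62                              # approval rate at launch
this_month = [True] * 41 + [False] * 59      # 41% approval this month
if has_drifted(baseline, this_month):
    print("Decision pattern has drifted; trigger an ethics review.")
```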
Pina: At Google, we’re excited about the pioneering work we’re doing with the industry on the Privacy Sandbox and giving people more control of their data. What are the key hurdles that keep executives and leaders from making necessary changes in this space? How can they address these ethical issues?
Barnard: To build a healthy system, we need to make interventions to address ethical dilemmas before they become issues. With this in mind, there are three things to think about.
Firstly, we must acknowledge our blind spots. When awareness of ethical risks is low, we miss things. We need to train ourselves to spot these risks and address them.
For example, if an engineer creates a system to optimize click-through rates, they may not notice when the algorithm’s decisions start having a negative impact on diversity or start to exclude people. If you’re actively looking for this, you can intervene and adapt how you measure success in a more inclusive way. This is a shared responsibility, but it starts at the design phase. From a management perspective, it’s beneficial to introduce training to help your teams consciously look out for ethical risks.
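To make “actively looking” concrete, here is a minimal sketch of one possible audit: comparing how often a click-through-optimized system serves each audience segment. The function names, segment labels, and parity threshold are all hypothetical assumptions, not a real pipeline.

```python
# Illustrative sketch only: audit how often a click-through-optimized system
# serves each audience segment, and flag segments that are being squeezed out.
# Segment names, counts, and the parity threshold are hypothetical.

def underserved_segments(impressions_by_segment: dict[str, int],
                         parity_threshold: float = 0.8) -> list[str]:
    """Flag segments whose share of impressions falls below a fixed fraction
    of the best-served segment (a crude four-fifths-style rule)."""
    if not impressions_by_segment:
        return []
    best = max(impressions_by_segment.values())
    return [segment for segment, count in impressions_by_segment.items()
            if best and count / best < parity_threshold]

# Example: the ranker is hitting its click-through target, but the exposure
# log shows one audience is quietly disappearing from the results.
weekly_impressions = {"segment_a": 10_200, "segment_b": 9_400, "segment_c": 3_100}
flagged = underserved_segments(weekly_impressions)
if flagged:
    print(f"Success metric may be excluding people; review: {flagged}")
```

The specific check matters less than the habit it represents: once inclusion is part of how success is measured, exclusion becomes visible early enough to intervene.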
However, this requires full visibility of the end-to-end flow of data. It can be difficult to identify risks and ensure governance frameworks are in place if part of the data flow is unknown (usually because it starts or finishes outside your organization). That’s why it’s so important to work with transparent companies that share your values and ethical principles.
The second hurdle is accountability. Sometimes an individual or team may spot an ethical problem that stems from a legacy system or process. In an ideal world, people would flag the issue and address it. But, of course, it can be difficult to unravel long-running systems. People may not feel personally responsible for finding a solution to a problem they didn’t create. So, it comes back to shared accountability, and communicating that everyone has a collective responsibility to grasp the nettle and address any problems.
The third hurdle is a lack of psychological safety. As business leaders, we want people to lean in and raise their hands when they spot potential ethical issues. But a lot of people, particularly younger workers under pressure to deliver, may not want to speak out. They may not want to create friction or slow down processes, which could reflect badly on them. Leaders need to ensure these people feel safe to point out areas of concern without fear of reprisal, even if it’s only a hunch that something is wrong. Otherwise, problems persist and can get worse over time.
Pina: Your WFA report was the world’s first guide on data ethics for brands. How did you end up writing it, and how has your thinking changed since?
Barnard: The WFA report was a product of the WFA Data Ethics Board. As responsible advertisers, we were trying to drive transparency and build trust. The problem was (and still is) that data use is profoundly complex, and the heavy burden of decoding it is placed on the consumer. The privacy notice is not a user-friendly solution to the problem.
The board agreed that ethics was a useful antidote to that complexity. If we can embrace ethics as an industry, it will reassure people and build trust and credibility.
The WFA report was a way of telling the industry that we have to change the narrative from talking about privacy as a risk factor to talking about privacy and data ethics as a positive step toward a healthy digital society.
For me, the most uplifting aspect of data ethics is the impact it can have on driving diversity and inclusivity. Positive interventions can eliminate bias, include the excluded, and celebrate our differences, which is precisely what digital should unlock.
Pina: Has this report changed your approach to privacy and data ethics at Unilever? Has this become a board-level topic and something that’s being discussed in the C-suite?
Barnard: The WFA Data Ethics Board mobilized in the spring of 2019, and the report reflects our collective thinking. The intention was always to help brands benefit from everything we have learned.
At Unilever, our initial focus on data ethics centered on ethical best practice in AI and machine learning. However, quite quickly, we expanded our scope to include data ethics more broadly.
In terms of support from our leadership, we were pushing against an open door. Our focus now is on putting ethical principles into practice. As well as helping to manage risk, we believe that the practical application of ethical principles will help people make smart, confident decisions at speed. This is what I meant when I said that this is not just about being cautious or compliant; it’s about going the extra mile, being enterprising and progressive, but without sacrificing your values.
It’s amazing to see how data ethics is capturing the imagination and determination of so many people. It certainly gives me hope for a healthy digital society and a positive future.
Pina: If people only take one thing from the WFA report, what should it be?
Barnard: There are actually two things, but they’re closely related. Firstly, we should think of ethics in the same way we think about sportsmanship. A good sport has scruples; they will do the right thing even if it requires sacrifice. Their commitment to honesty, integrity, and fair play means they work twice as hard as those who cut corners, but their success is all the richer for it.
Secondly, continuing with that theme, building trust is a team sport. We are all accountable for collecting and using data in a safe, ethical, and transparent manner. Embedding data ethics across the industry requires commitment, cooperation, and responsible leadership from advertisers, technology platforms, publishers, developers, and tech vendors alike. If we work together, we all win together.