
Panel: Corporate Ethics and the role of Whistleblowers: AI DIET World 2021

“I always say that ethics has to start from the top down.”

– Shilpi Agarwal

“When they [WhatsApp] talk about end-to-end encryption you feel so secure, but the security level is as pathetic as one of us using the same password on different accounts.”

– Susanna Raj


Talk Summary

In this panel discussion, members of the DataEthics4All Leadership team discuss Corporate Ethics and the Role of Whistleblowers in organizations, in light of the recent WSJ leak from former Facebook product manager Frances Haugen on the harms the company has caused and ignored.


0:01 Sam Wigglesworth:

Welcome to our first panel of the day, my first panel and the team’s first panel on corporate ethics; we’re going to be looking at the role of whistleblowers. I’m going to be moderating, I’m Sam, and we have our panel, which is our amazing leadership team; Kevin Dias is here, as is Bruce Hoffman and Susanna Raj. Really looking forward to it! The main discussion point today is the recent news about Frances Haugen’s report and testimony to Congress about her experience at Facebook, the interviews that she’s had with media outlets, and what our thoughts are on the points that she raised. So, as a starter question to the panel: when just a few days ago we couldn’t access our WhatsApp chats, we couldn’t get on Instagram, we couldn’t update our Facebook profiles – we had a few hours out of social media; how did you feel when that happened?

1:30 Bruce Hoffman:

It’s kind of funny because in my case I would say it’s probably good for everyone to have a rest from social media, any possible time that can actually happen, but at the same time, in thinking about that and listening to what’s been said, you realize it’s not just a rest from social media but it’s a break from people being able to do business; a break from people being able to communicate, because of all the dependencies on the platforms that Facebook owns. I think it was 2.8 billion people that were actually communicating across the multitude of platforms that went down. So that being down just made it feel like there’s too much all in one place, and it’s almost a risk to world communications to have that be at a company level rather than some ownership beyond it, so it’s an interesting challenge that we’ve put ourselves in. And then of course, hearing about it, it almost feels kind of crazy that it’s that big and that much impact could really occur. Personally I didn’t even realize it was going on until somebody on the panel shared it with me, because I wasn’t on at that point!

2:54 Sam Wigglesworth:

Yeah, that’s a great point – we rely on it so much, it’s a significant number of people, it’s global and it affects us all. What are your thoughts then Kevin or Susanna? Are you happy with the fact that you didn’t necessarily have to access social media for a few hours or do you rely on it, for business for example?

3:17 Susanna Raj:

I don’t rely on it for business, but I realized how many of my family and friends from my native land (India) were using WhatsApp to connect with me, not my traditional phone number, because that has an international cost. So they were all using WhatsApp, and then to see that they were not able to reach me and I was not able to reach them – I just realized how much I was relying on this free service without thinking that it could be taken away from me! Maybe I am the product now – whenever something is free they say you are the product they are selling – so I just realized: okay, now I need to have a backup plan, something that is paid, so that I know I can access it. And also to realize how insecure the whole process behind the scenes was when I was reading the stories; even though I could not understand the technical terminology used, I could still see that it was so incompetent; that really shocked me. When they talk about end-to-end encryption you feel so secure, but the security level is as pathetic as me using the same password on different accounts; they were doing the same thing at a different level. So that also made me feel less trusting of those mediums that I’m now using, especially WhatsApp. I did not even know Instagram or Facebook were down because I’m not even on those platforms, but I realized WhatsApp was down, and that’s what made me go and look at the news to see what was going on.

5:16 Kevin Dias:

I think if I could add to what was already said: we’ve put ourselves in this place where we’re so reliant on WhatsApp. I think it’s funny – Bruce’s dog (barking in the background) echoes the sentiment of my extended family back in India on that day, for those five hours; I think everyone there uses WhatsApp, some of them use Facebook, not too many use Instagram. When this happened they had no way of reaching out to me, and we reach out to each other every day, so there was a huge panic; they were like ‘what’s going on, is the internet down?’ That was a funny comment that my mom made – ‘is the internet down’ – because that’s all they do on the internet, it’s just WhatsApp and Facebook. I remember a few months ago, I had actually tried to get them on other instant messaging apps like Telegram, and another comment now comes to mind that my mom and dad had made then: ‘do you really need another app, isn’t WhatsApp enough? It’s amazing, it’s so easy to use’ – all those kinds of things. Now that I think about that, I’m realizing that Facebook has made it so attractive from a user experience perspective. Not a lot of people are willing to make the leap to a different app of any sort, and we all know that there are different instant messaging apps out there, but most people prefer Facebook and WhatsApp.

7:01 Sam Wigglesworth:

Yeah, definitely. We do rely on it as a chat tool and also for sharing images and audio, particularly in a country like India where there’s almost a dependence on it. There are alternatives out there too, but sometimes we rely on one application, one platform, and that potentially puts us at risk of not being able to communicate with our family and friends – if someone can just turn the power off, suddenly we can’t talk to close family members across the globe. Definitely worth thinking about for the future. The other thing we were discussing as well, that Frances Haugen brought up, was safeguarding and the importance of making sure, particularly with Instagram, that we look after our privacy rights and consider the harm of platforms like Instagram on young people. Do you think that there’s an issue at the moment with platforms like Facebook and Instagram, and is it having an impact on young people, particularly young girls and their health? Or do you think that it’s okay for them to use it as frequently as they sometimes do; what’s your view?

8:29 Bruce Hoffman:

I think it’s pretty clear that everything that we hear, read, see and study shows that there’s an impact. I mean, originally it was a lot more Facebook, with shaming and people committing suicide as a result; there have been quite a few studies on the dependency. It’s kind of interesting, going back to the previous piece of the conversation: if we go back in time we didn’t have this dependency; if you couldn’t communicate for five hours, that wouldn’t have been a surprise. You might have said: ‘hey, I could communicate within a day’; it’s that dependency on things being instant that has changed the entire generation, and you’re right, everybody’s expecting instant feedback. If you don’t get positive feedback, and you’ve created this dependency that’s really occurred on social media across the board, people begin to feel real pain – it’s psychological; there are mental health issues that have cropped up and are getting significantly worse, and it’s just odd that society has created such a dependence on this. We’ll talk tomorrow a little bit about Covid, but it has only increased our dependency on this connection. I like what Kevin said a moment ago – ‘is the internet down’, right? It’s like the internet is now water to us – it’s the same as air, electricity, food; the internet is the next piece of that, we can’t do without it, and it can cause a lot of pain from people just giving a thumbs down or creating negativity around what we post. I’ll let others speak about it, but I know it’s a pretty common theme that we hear.

10:26 Susanna Raj:

I think what struck me from Frances Haugen’s interview, and also from reading her brief on the documents that she released to the public, is that Facebook was putting profits over the public good; what was good for the company was not good for the public, so it was not even aligned properly. Building on conflict is their entire business model, so there was no way for them to get around it, even though they had a civic integrity panel put in place before the elections. They removed the panel once the elections were over, and were like ‘okay, the election went fine, let’s dismantle this panel now, we are okay’ – and then January 6th happened and they were like ‘oh no, it was not okay’. But they just didn’t realize – what they were doing was optimizing for engagement, and by naturally optimizing for engagement they were optimizing for hatred, optimizing for negativity, and also for anti-democratic forces. What are your thoughts on having a business model that is diametrically opposed to the public good?

12:27 Shilpi Agarwal:

I know I’m not on the panel, but I think this is such an interesting question! I think the way the corporate board is designed is why profit is incentivized; to the point that even if you want to put ethics at the forefront, sometimes it is hard. I don’t justify whatever is happening with Facebook – they are now a very big company and they can do whatever they want – but imagine something similar happening at a smaller startup. It starts with the design and the senior leadership; I always say that ethics has to start from the top down, it can’t go from the bottom up. But if the company doesn’t make a profit, it’s a cycle – you don’t get to the second round of funding or the third round of funding unless you show that you’re making profits. Not necessarily that you have to incentivize profits over the user, but this whole funding mechanism works that way. All I’m saying is that you have to put in a conscious effort to make that decision, draw that line, and be ready to walk away from the next round of funding to stick to your ethical principles. It’s not going to be easy, but somebody has to do it.

14:14 Bruce Hoffman:

The only thing I was going to say in there was on sensationalism – if we go back to the comments I was making to the team, if you were to watch the news, 95% of it is sensationalism and showing all the negativity. What started before this is only continuing; and Susanna you made a really good point to me, that it has only become worse because now people are interacting with it directly where before they really couldn’t do that with the news. It’s almost like Facebook and any others just took what the world was really asking for, and just made it worse. Just saying: ‘you really now can access, instantly, all this information, now react to it and put your own spin to it.’ You can have viral stuff that’s fun, but boy, you can have viral stuff that just has incredible negativity. It seems like it should be their responsibility to do something about it. I think that gets back to something said at the beginning about whether regulation is coming, and will it help? Or is it always just going to be too late and not enough?

15:34 Shilpi Agarwal:

Regulation is a very low bar. To your point, has anyone seen the Apple TV+ series The Morning Show? It’s exactly what Bruce talked about – the media people always say they only tell what the audience wants to hear, they only make films that the audience wants to see, they only bring out the role models in society that society believes in or wants to see at that time. It’s true and not so true, because if media people don’t do their job of bringing out things that people are not ready for, how will people ever be ready for the right thing?

16:26 Susanna Raj:

It was really telling that in response to all this backlash, the policy director of communications at Facebook – this was her official response – actually said: “solutions have not yet been found by anyone, so if you know of any company that has found a solution, let us know.” I mean, seriously?! What do you guys think, have solutions not been found yet to this problem?

17:05 Bruce Hoffman:

It sure seems like there have!

17:06 Kevin Dias:

Companies just need to think about it a little bit more, rather than just saying that there won’t be a solution or that it’s impossible, that kind of thing. To Shilpi’s point as well, ethics starts from the top down. I feel like in the future, once C-suite level executives start appreciating people more than their profit margins – easier said than done, because ethics can run counter to their business goals – that’s the direction we need to head towards: looking at people as people rather than just numbers. Again, it’s easier said than done, but if we have that mindset I think we can come up with solutions to this.

18:02 Sam Wigglesworth:

I think you’re right, it’s a really good point. It’s about making C-suite executives a little bit more accountable, building a bit more transparency within organizations, and knowing what’s happening if you’re building a piece of technology that could potentially change the world and benefit people. It’s about putting the user first. We need to know what’s happening, how our data is being used, how the algorithm works within a particular application that’s been developed, and truly understand it, to build that transparency and trust. Companies also need guidance and frameworks to support them.

18:56 Bruce Hoffman:

We keep hearing about all these new services that are also free and that actually protect your privacy, but the other ones have become so big – how do you even really make a change? It almost feels like such a small space. I mean, you’re right, I think we can all agree that change is needed. That’s an easy thing to say; making it happen is difficult. What we’re here for is to try to help; all of our voices, I think, are in good agreement there, but all we can do is continue to push.

19:37 Sam Wigglesworth:

I think Frances Haugen also said that social media can be used for good, and we need to focus on that – positive AI and social media for the future. How we can create that trust is also important, carrying on with that message as well. It can be used for good and we don’t want to turn away from it. It connects billions of people across the world and we rely on it.

20:17 Susanna Raj:

So that brings us to the question of moderation: how to moderate content, how to moderate opinions that you don’t necessarily agree with. While somebody behind the scenes is moderating the content on Facebook, they have to look at the freedom of speech of both parties and decide which ones they should moderate and which one becomes hate speech – where do you draw the line? Those are the kinds of questions that we need to ask ourselves as well, when we post content and comments – it’s our freedom of speech, but then how far should it go, where are the boundaries? What does everyone think on that?

21:00 Kevin Dias:

I remember studying this and I know that the conclusion of some of the research that I’ve read is that the human brain is just naturally attracted to divisiveness. I think Facebook realized this early on, and obviously it’s not the best thing to leverage or take advantage of, but they did. That’s why when you see the comment sections it’s usually more than just comments, they’re arguments, most of the time. I think another realization that Facebook had early on is that a lot of engagement with or consumption of the content itself happens in the comment section. A lot of people prefer finding their information from the comment section versus the actual content that was posted, just to maybe validate what was posted or find out more information, get reviews and opinions on it. All of that combined, I think Facebook is just leveraging people’s emotions that way, the brain’s natural inclination towards divisiveness.

22:13 Susanna Raj:

We are socially motivated – if you share something like a news article with me, and then you post a one- or two-liner, I’m more interested in what you said than in the content itself. When we are socially connected with each other we trust the person who has sent us that content more than the content itself, and we don’t bother reading it and seeing whether it actually agrees with our opinions. I won’t say that Facebook did not know the comments section would be their most engaging, profit-making section – they do know that, and that’s why most of their algorithm works behind the scenes on the comment section, not just on the post. They do know; they did have psychologists there to manipulate us, but I wish they would have psychologists on board to manipulate us in the other direction as well.

23:19 Sam Wigglesworth:

It’s a good point, we don’t want to be dividing and going against free speech or democracy, but I think the amplification of negativity is something that needs to be looked at. We need to continue having that debate on how we move forward with that to get the balance right.

23:45 Bruce Hoffman:

At least amplify positivity also.

23:48 Sam Wigglesworth:

Yeah, definitely. More transparency, a more positive future, and starting from the top down, as Shilpi said. Do you have any other comments to make about what was raised? Frances covered quite a lot, but do you feel that, for example, we can move forward and still use these tools for good, and do you think we’re going to be moving towards greater data privacy and protection? What are your views on the future in terms of social media and some of the risks?

24:45 Kevin Dias:

I also wanted to address mental health, because I know we talked about it – or rather we mentioned it, but we didn’t unpack it enough; we spoke about it particularly in the context of the younger generation. Besides the whole community aspect of Facebook, there are also features and tools within it that we should acknowledge, such as the use of filters. From a content creation standpoint, when we talk about mental health, filters also need moderation; I know we talked about moderation of comments and discussions, but filters need moderation too, or curating at least, because as we know, filters create this superficial sense of beauty or attractiveness, and that spirals into many of the mental health challenges that we face these days. Thinking about specific tools and features like these is important. I know that the community aspect and the instant messaging aspect are also important, but I think these features will become more relevant in the future as well. To the point that was made earlier about taking all of these negative challenges and finding the good in them: filters have been used for good – for example, virtual shopping. Maybe we should explore that a little more, versus the way they have been used negatively, in a more addictive, toxic way so far.

26:44 Susanna Raj:

I think we do have some questions as well, right – so maybe we should take those?

26:51 Sam Wigglesworth:

Yeah, let’s do that. Alberto says: ‘if this is an internal effort, should we not be bringing third parties on board to help find solutions?’ What do we think?

27:14 Susanna Raj:

That’s always the question of internal versus external audits. I don’t think it’s going to work unless it’s internal. In countries like the United States we have traffic lights and traffic signals, and everyone follows them, even in the middle of the night when there are no cars on the road; you will instinctively stop at the red light even though there is nobody there. That has to come from inside, so it has to be internal first and external next. That is my opinion: external rules can be there, but people might not comply with them; everybody from the top to the bottom has to internally feel it, and it has to be applied in that realm where you will do the right thing even if nobody’s watching you.

28:06 Bruce Hoffman:

That’s right, it’s a cultural change that we would hope happens across the board, although we know that when we look at the world there’s so much hate and challenge, that it’s hard to see the cultural change actually occur. Who decides? I think the people decide to actually do things differently – it can’t be forced, and the corporations tend to just take advantage of whatever they can until they’re really pushed to do something different.

28:41 Shilpi Agarwal:

Yeah, and I think that’s why organizations like DataEthics4All have an important role to play. What do you guys think about that?

28:52 Kevin Dias:

Yeah, a third party could be us or an organization like us, but the comment also asks whether this third-party resource should be brought on board to help find a solution. I think the question really is: even if there are solutions, will the company implement them? Because for them it’s more a question of the profitability of that solution. With solutions also comes testing, and just the experimental phase of it can cost the business millions of dollars, depending on the business – if we’re still talking about Facebook – and I think Facebook, or companies like Facebook, would just be hesitant to put in that work. But still, there’s definitely good that can come out of suggesting these solutions, and hopefully Facebook does take that leap of faith and try out some of the good solutions that are out there.

29:50 Shilpi Agarwal:

It comes back to the cultural questions we touched upon. We have to draw a fine line; we have to walk this very difficult line where we have to make an ethical choice. It’s like everything in life, right? My parents always said – I’m sure everybody’s parents said the same thing – that whenever life gives you a fork in the road, you can take the easy path that seems shorter and easier right now, that is not doing the right thing but will lead you to greater success in the short run. But the longer path – the righteous path – may take you longer to get there, but it is the right way to get there. I think on that note, thank you to my panelists and my leadership team, thank you so much for this wonderful, inspiring discussion! I’m sure the discussion will continue – we can’t solve this in one discussion, but it’s the start, and very much needed.

Speakers at AI DIET World 2021: Susanna Raj, Kevin Rose Dias, Sam Wigglesworth and Bruce Hoffman – Leadership Team, DataEthics4All

AI DIET World 2021 also featured Senior Leaders from Salesforce, Google, CannonDesign and Data Science Central among others.

For media inquiries, please email us at connect@dataethics4all.org


Come, Let’s Build a Better AI World Together!