

Facebook founder ‘running away to metaverse’, says whistleblower Frances Haugen at Minderoo event




Mark Zuckerberg, the Facebook co-founder and chief executive, is “running away to the metaverse” to avoid resolving the divisive chaos his social media platform has unleashed on the world, said whistleblower and data scientist Frances Haugen in Cambridge on Friday evening.

‘Frances Haugen: Can We Trust Tech?’ at the Babbage Theatre with, from left, Prof Gina Neff, executive director, Minderoo Centre for Technology and Democracy, and Prof John Naughton, chair, advisory board, Minderoo Centre for Technology and Democracy. Picture: Keith Heppell

Speaking to a packed house at the Babbage lecture theatre, the former Facebook product manager pointed the finger at the Silicon Valley company’s president of global affairs, Nick Clegg, for enabling and encouraging the billionaire’s flights of fancy, and called on those close to Zuckerberg to encourage him to address the crisis because “until Mark gets to heal he can’t heal us”.

Born in Iowa in 1984, Frances Haugen studied electrical and computer engineering before taking an MBA at Harvard Business School in 2011. She began her career at Google as a software engineer, rising to product manager, and went on to work for Yelp and then Pinterest. She joined Facebook as a product manager in June 2019 and stayed until May 2021.

During her time at Facebook, Haugen became increasingly alarmed by the choices the company makes, “prioritising their own profits over public safety and putting people’s lives at risk”. Last summer, she handed tens of thousands of pages of internal Facebook documents to the US Congress and the Securities and Exchange Commission (SEC). The documents were published in the Wall Street Journal as ‘The Facebook Files’.

The tranche of documents reveals that Facebook knew its products were damaging teenagers’ mental health, fomented ethnic violence in countries such as Ethiopia, and allowed misinformation to spread before the US Capitol riot in Washington on January 6, 2021. They portray a company that is aware of the harm that the algorithms used by the platform are causing, but is “unwilling or unable to act”.

Facebook has already launched the Horizon Workrooms app, giving an early example of what a metaverse meeting could look like. Image: Facebook

The Cambridge event, titled ‘Frances Haugen: Can We Trust Tech?’, was organised by The Minderoo Centre for Technology and Democracy, an independent team of academic researchers “radically rethinking the power relationships between digital technologies, society and our planet” at the University of Cambridge.

The whistleblower gave an introductory talk followed by a Q&A with her on-stage companions, Prof Gina Neff, executive director of the centre, and Prof John Naughton, chair of the centre’s advisory board and the Observer tech columnist.

The core issue of the Facebook phenomenon, Ms Haugen said, isn’t the company’s fault – human nature, as it stands, ensures that “polarising, extreme content gets the most engagement and that has huge consequences”.

What Facebook’s algorithms do is push that extreme content higher up the news feed, Ms Haugen said.

“Facebook made a comment on this in the last couple of weeks, on Mark Zuckerberg’s personal page in fact, that users’ news feeds will have less content from family and friends – before it was more from family and friends. Family and friends are not the problem. If you get only content that you consent to, it’s less aggressive and confrontational.”

Frances Haugen

Why is Facebook putting this unasked-for content higher up our feeds?

“I think they’re scared people are moving on and that’s really frightening for them.”

One of the problems for users, she said, is that we have no say in how content is presented to us.

“When I joined Facebook I thought maybe 300 or 400 people would understand the algorithms, but it’s only 200 people. No system can be understood like that. We need to bring people to the table, it needs 10 or 15 per cent of people to have a seat at the table.

“We’re entering an era where technology is moving faster and faster. With cars you could take an engine apart and work on it. Now, we don’t get to understand what’s going on behind our screens.”

Ms Haugen stressed that social media is so early-stage that “my first product manager at Google only started in 2006 – it’s a very young field”.

She added: “We all need to be aware that Facebook is not a mirror, it’s an amplifier. Not all information is distributed equally – good speech doesn’t always get to counter bad speech. Truth is nuanced.”

The fact is, humanity has never been so thoroughly immersed in the brain games that social media presents us with, and has yet to adopt coping mechanisms.

“The printing press was not disruptive,” is how Frances puts it. “Teaching people to read was disruptive.”

But in spite of her life-defining experience at Menlo Park, she isn’t anti-Facebook.

“It’s possible for an ethical person to work at Facebook,” she said. “We need good people to have their eyes open to see how they work. I think we should have publicly-funded fellowships to work at these institutions. These systems will change when good people work there.”

‘Frances Haugen: Can We Trust Tech?’ at the Babbage Theatre with, from left, Prof Gina Neff, executive director, Minderoo Centre for Technology and Democracy, and Prof John Naughton, chair, advisory board, Minderoo Centre for Technology and Democracy. Picture: Keith Heppell

Encouraging people to “take a step back”, she considered the history of literacy.

“Start in 1900. By World War One, we discovered that if you didn’t live in a city you probably weren’t literate. We need to live in a society where everyone is literate – education matters if you want to check power.”

The audience was invited to ask questions for the final part of the session. Someone asked whether Mark Zuckerberg’s professed intention to change the way Facebook operates was genuine. It turned out that Ms Haugen’s most radical analysis of Facebook’s problems centres on the character of its totemic instigator.

“I’m not a psychologist,” she replied, “and I’ve never met Mark, but I know he missed out on a lot of foundational experiences. He left college at 19 and [straight away] got to be the boss. In addiction classes they say if you start drinking at 16, you use it as a crutch.

“In 2016 he had a traumatic year. When he went to Africa they said he was a saviour, he was riding high – and six or seven months later [following Donald Trump’s election to the US presidency] it was like he was destroying democracy. Maybe that’s why he started the metaverse.”

She went on: “There’s a danger that comes from doing tests in isolation. The word ‘metaverse’ actually comes from a dystopian novel, it’s not an aspiration, it’s about a society in decay.”

Author Neal Stephenson coined the term ‘metaverse’ in his 1992 science-fiction novel Snow Crash, which envisions a virtual reality-based parallel universe in which digital avatars explore an online world as a way of escaping a dystopian reality.

“He spends all day on metaverse,” Frances asserted, adding: “There is a danger of making a substitute virtual life. What are the consequences? We need to consider that because we are beginning to go down that road where people are spending all day with their headsets on.”

What of those around him? Nick Clegg, the ex-leader of the Liberal Democrats and former UK deputy prime minister, joined Facebook as VP global affairs in October 2018 and is currently president of global affairs for Meta.

“Until Mark gets to heal he can’t heal us. He’s running away to the metaverse, and the really sad thing is that Nick Clegg...” She paused as the audience burst into laughter for the first time at the reference, then continued: “Mark Zuckerberg had no clue aged 19 what he was stealing from himself, but Nick Clegg is telling Mark Zuckerberg that ‘people are jealous of you’ and ‘you’re going through a growth experience, you have to accept you’re going to be very unpopular sometimes’.

Mark Zuckerberg announcing his new company brand, Meta. Picture: Meta

“Mark is being hurt by his own choices – it’s about power and the powerless. He has solutions he could release today that don’t touch content… it has zero impact on profit.”

Frances’ testimony at the UK Parliament’s Joint Committee on the draft Online Safety Bill in October exposed long-held concerns about violence and instability in Myanmar and Ethiopia being stimulated by Facebook: she claimed that Facebook was “literally fanning ethnic violence” in places like Ethiopia because it was not policing its service adequately outside the US.

“If Mark is suffering because he’s standing up for freedom of speech, it will cost him 0.1 or 0.2 per cent of profit. In some countries reshares are 35 per cent of content.”

Meta – formerly Facebook Inc – achieved $117bn turnover in 2021. There are concerns around the lack of investment in the moderating teams required to police the platform. Ms Haugen noted that the safety team “is attuned for American English”.

“In 14 languages, four per cent of content is misinformation…. Which moderators speak the languages, where do they sit? The Ukrainian language moderators were sitting in ‘Russia’ up to last September. Technique needs to live in democracy’s house.”

The comment comes as Mark Zuckerberg is being sued by the Washington DC attorney general over his precise role in the Cambridge Analytica scandal.

She concluded: “Mark is really isolated… he needs to sell some of his shares.”

Prof Neff then asked: “What can we as individuals do?”

Frances has widely warned of the dangers of the model Facebook uses in Africa, where it provides free internet access – all of it hosted by Facebook, including business pages and access to medicines. Its Free Basics internet service, launched in 2015, now covers 32 African countries.

“The reality is that people in places like Ethiopia are saying ‘Facebook let us down, there was no support’,” Haugen replied. “It has become so disruptive that they are saying ‘we’ll build our own thing’, and someone has said WeChat could be the new model – that’s based in China. We’re about to face a cold war between a libertarian internet and an authoritarian internet.

“If we lost Africa that impacts on all of us. We have to start talking about these things because we still have time to live in a different future.”

A request was made to Meta to contribute to the discussion.


