
'Instagram may NEVER be safe for 14-year-olds': Whistleblower Frances Haugen quotes Facebook's OWN research - and says social media giant won't sacrifice a 'slither of profit' as she reveals massive dossier of damning internal evidence

Facebook whistleblower Frances Haugen today issued a stark warning to parents that Instagram 'may never be safe for 14-year-olds' as she said the tech giant's own research found children are turning into addicts and bullying was 'following them into their bedrooms'.

The former employee said Facebook knew Instagram was dangerous for young people but did not want to act because 'young users are the future of the platform and the earlier they get them the more likely they'll get them hooked'. She said the platform was unwilling to sacrifice 'even a slither of profit' for safety improvements.

Her appearance coincided with her release of a fresh trove of documents which sensationally revealed CEO Mark Zuckerberg 'personally intervened' to allow US right wingers to 'say what they wanted' on the platform. 

The memos - which have been dubbed 'the Facebook Papers' and comprise internal research she secretly copied while working at the firm's 'integrity unit' - also revealed how bosses ignored internal complaints from staff for years to put profits first, 'lied' to investors and sought to shield Mr Zuckerberg from public scrutiny. 

Today, Ms Haugen told MPs that Facebook meant childhood bullying was no longer confined to the classroom.

'Facebook's own research says now the bullying follows children home, it goes into their bedrooms. The last thing they see at night is someone being cruel to them. The first thing they see in the morning is a hateful statement and that is just so much worse.'

She claimed that the firm's own research found that Instagram is more dangerous than other social media such as TikTok and Snapchat, because the platform is focused on 'social comparison about bodies, about people's lifestyles, and that's what ends up being worse for kids'.

Ms Haugen also cast doubt on whether Instagram could ever be made safe for children. At present, you must be at least 13 years old to use the service, though it is easy for users to lie about their age.

Facebook was developing an Instagram Kids specifically for children but the idea was put on hold earlier this year due to the raft of concerns.

'I am deeply worried that it may not be possible to make Instagram safe for 14-year-olds and I sincerely doubt it is possible to make it safe for a 10-year-old,' Ms Haugen said.

During today's hearing, Ms Haugen said Facebook's algorithm 'prioritises' extreme content and although the firm is 'very good at dancing with data' it is 'unquestionably' making online hate worse and pushing users towards extremism.  

The whistleblower was addressing a parliamentary committee scrutinising the government's Online Safety Bill, which would place a duty of care on social media companies to protect users – with the threat of substantial fines of up to 10% of their global revenue if they fail to do so.

Opening the session, she said: 'I am extremely, extremely worried about the state of our societies. I am extremely concerned about engagement-based ranking, which prioritises extreme content.' 

Ms Haugen said Facebook was reluctant to sacrifice even a 'slither of profit' to make the platform safer, and said the UK could be particularly vulnerable because its automated safety systems may be more effective with US English than British English. 

'I am deeply concerned about their underinvestment in non-English languages and how they mislead the public in how they are supporting them,' she said.

'UK English is sufficiently different that I would be unsurprised if the safety systems that they developed primarily for American English were actually underenforcing in the UK. Facebook should have to disclose those dialectical differences.'

She said that one of the effects of Facebook's algorithm was to give hateful advertising greater traction, meaning it was 'cheaper' for companies and pressure groups to produce angry messages rather than positive ones. 

Ms Haugen described this process as 'subsidising hate'.

Responding to this afternoon's session, and following a meeting with Ms Haugen, Home Secretary Priti Patel said 'tech companies have a moral duty to keep their users safe'.

Ms Patel said it was a 'constructive meeting' on online safety.  

Firing off a barrage of devastating allegations that will further trash Facebook's already tattered reputation, Ms Haugen claimed:
  • Facebook's algorithm prioritises hate speech by showing people content based on how much engagement it has received;
  • 'Anger and hate' is the 'best way to grow' on the platform, and 'bad actors' are playing the algorithm by making their content more hateful;
  • The world is 'at the opening stages of a horrific novel' due to extremism spreading via social media unless regulators act;
  • Facebook is reluctant to sacrifice 'even slithers of profit' to prioritise online safety and 'unquestionably' makes online hate worse;
  • Children's relationship with platforms like Facebook is an 'addicts' narrative', with youngsters saying social media sites make them unhappy but they are unable to stop using them;
  • Facebook could tackle this problem but 'they don't because they know that young users are the future of the platform and the earlier they get them the sooner they get them hooked'; 
  • Platform had demonstrated 'negligence' and 'ignorance', but resisted the term 'malevolence' as this 'implies intent';
  • 'Underinvestment' in foreign languages means Facebook is less able to monitor content not in US English.  

Facebook whistleblower Frances Haugen is appearing before a parliamentary committee scrutinising the government's draft legislation to crack down on harmful online content

The data scientist's appearance coincided with her release of a fresh trove of documents which sensationally revealed CEO Mark Zuckerberg 'personally intervened' to allow US right wingers to 'say what they wanted' on the platform

Speaking to MPs today, Ms Haugen likened failures at Facebook to an oil spill.

'I came forward now because now is the most critical time to act,' she told the select committee. 'When we see something like an oil spill, that oil spill doesn't make it harder for a society to regulate oil companies.

'But right now the failures of Facebook are making it harder for us to regulate Facebook.'

The whistleblower said she had 'no doubt' that events like the storming of the US Capitol would happen in the future due to Facebook's ranking system prioritising inflammatory content. 

She said the problem could get worse due to the social media giant prioritising the creation of large Facebook groups so people spend more time on the network.

'Facebook has been trying to make people spend more time on Facebook, and the only way they can do that is by multiplying the content that already exists on the platform with things like groups and reshares,' she said. 

'One group might produce hundreds of pieces of content a day, but only three get delivered. Only the ones most likely to spread will go out.'

Ms Haugen said Facebook groups were increasingly acting as 'echo chambers' that are pushing people towards more extreme beliefs.  

'You see a normalisation of hate and dehumanising others, and that's what leads to violent incidents,' she said. 

She added that the platform was 'hurting the most vulnerable among us' and leading people down 'rabbit holes'.

'Facebook has studied who has been most exposed to misinformation and it is ... people who are socially isolated,' she told the select committee.

'I am deeply concerned that they have made a product that can lead people away from their real communities and isolate them in these rabbit holes and these filter bubbles.

'What you find is that when people are sent targeted misinformation to a community it can make it hard to reintegrate into wider society because now you don't have shared facts.' 

The whistleblower argued regulation could benefit Facebook in the long run by making it a 'more pleasant' place to be.   

She said that Twitter and Google were 'far more transparent' than Facebook, as she called for Mr Zuckerberg to hire 10,000 extra engineers to work on safety rather than the 10,000 engineers set to build the company's new 'metaverse' initiative.

Ms Haugen said that 'anger and hate' is the 'best way to grow' on Facebook, and said bad actors were playing the algorithm by making their content more hateful. 

'The current system is biased towards bad actors and those who push Facebook to the extremes.' 

The whistleblower urged ministers to take into account the harm Facebook does to society as a whole rather than just individuals when considering new regulation. 

'Situations like [ethnic violence in] Ethiopia are just the opening chapters of a novel that is going to be horrific to read. 

'Facebook is closing the door on us being able to act. We have a slight window of time to regain people control over AI - we have to take advantage of this moment.'

Ms Haugen urged MPs to regulate paid-for advertisements on Facebook, because hateful ones were drawing in more users. 

'We are literally subsidising hate on these platforms,' she said. 


'It is substantially cheaper to run an angry hateful divisive ad than it is to run a compassionate, empathetic ad.'

The whistleblower said Facebook was reluctant to sacrifice 'even slithers of profit' to prioritise online safety. 

Ms Haugen said systems for reporting employee concerns at Facebook were a 'huge weak spot' at the company.


'When I worked on counter espionage, I saw things where I was concerned about national security and I had no idea how to escalate those because I didn't have faith in my chain of command at that point,' she said.

'We were told to accept under-resourcing.

'I flagged repeatedly when I worked on civic integrity that I felt that critical teams were understaffed.

'Right now there's no incentives internally, that if you make noise saying we need more help, like, people will not get rallied around for help, because everyone is underwater.' 

Ms Haugen told the parliamentary select committee that the social media giant was 'unquestionably' making online hate worse.

'We didn't invent hate, we didn't invent ethnic violence. And that is not the question.

'The question is what is Facebook doing to amplify or expand hate ... or ethnic violence?'  

Ms Haugen said she 'sincerely doubted' that it was possible for Instagram to be made safe for children and that the platform promoted an 'addict's narrative'.

'Children don't have as good self regulation as adults do, that's why they're not allowed to buy cigarettes,' she said.

'When kids describe their usage of Instagram, Facebook's own research describes it as 'an addict's narrative'.

'The kids say 'this makes me unhappy, I don't have the ability to control my usage of it, and I feel if I left it would make me ostracised'.'

She continued: 'I am deeply worried that it may not be possible to make Instagram safe for a 14-year-old and I sincerely doubt that it is possible to make it safe for a 10-year-old.'

Ms Haugen said Facebook could estimate people's ages with 'a great deal of precision' but did not act to stop under-age users.

'Facebook could make a huge dent on this if they wanted to and they don't because they know that young users are the future of the platform,' she told a parliamentary select committee.

'The earlier they get them, the more likely they'll get them hooked.'

Facebook has previously outlined plans to set up a so-called Instagram Kids. It argues that under-13s already use Instagram despite age barriers, and that the new app would be safer for them.

Ms Haugen first aired her bombshell revelations in front of the US Senate earlier this month, where she argued a federal regulator is needed to oversee digital giants like Facebook.  

The draft Online Safety Bill proposes something similar by creating a regulator that would monitor Big Tech's progress with removing harmful or illegal content from their platforms, such as terrorist material or child sex abuse images.

Ministers also want social media companies to clamp down on online abuse by anonymous trolls.

Damian Collins, chair of the Joint Committee on the Draft Online Safety Bill, called Ms Haugen's appearance 'quite a big moment'.

'This is a moment, sort of like Cambridge Analytica, but possibly bigger in that I think it provides a real window into the soul of these companies,' he said.  

Mr Collins was referring to the 2018 debacle involving data-mining firm Cambridge Analytica, which gathered details on as many as 87 million Facebook users without their permission.

Ms Haugen first discussed her huge tranche of leaked internal Facebook documents in front of the US Senate earlier this month

The committee has already heard from another Facebook whistleblower, Sophie Zhang, who raised the alarm after finding evidence of online political manipulation in countries such as Honduras and Azerbaijan before she was fired. 

It comes as concerns were raised that details of the new legislation could be leaked to Facebook by civil servants who 'want to work for government for four years before getting a job at tech giants'.

The alarm was sounded after an online harms issue known only to a few people at the Department for Digital, Culture, Media and Sport was brought up by a senior Facebook executive in a recent meeting.

Jobs taken by former senior civil servants in the private sector are meant to be scrutinised by the Advisory Committee on Business Appointments (Acoba). But its powers are weak and more junior appointments are not vetted.

A source lashed out at department mandarins, telling the Times: 'The problem is that DCMS officials think it's their job to work there for four years then get a job at Facebook.

'They don't get scrutinised by Acoba except at the most senior level.'

Several DCMS officials have gone on to work for Facebook in recent years, in some cases after spells at other employers in between.

There is no suggestion they have solicited information from former Civil Service colleagues.

