

Whistleblower: Facebook is misleading the public on progress against hate speech, violence, misinformation

Frances Haugen spent 15 years working for some of the largest social media companies in the world, including Google, Pinterest, and, until May, Facebook.

Haugen quit Facebook of her own accord and left with thousands of pages of internal research and communications that she shared with the Securities and Exchange Commission. 60 Minutes obtained the documents from a Congressional source.

On Sunday, in her first interview, Haugen told 60 Minutes correspondent Scott Pelley about what she called "systemic" problems with the platform's ranking algorithm that led to the amplification of "angry content" and divisiveness. Evidence of that, she said, is in the company's own internal research.

"Facebook's mission is to connect people all around the world," said Haugen. "When you have a system that you know can be hacked with anger, it's easier to provoke people into anger. And publishers are saying, 'Oh, if I do more angry, polarizing, divisive content, I get more money.' Facebook has set up a system of incentives that is pulling people apart."

"One of the most shocking pieces of information that I brought out of Facebook that I think is essential to this disclosure is political parties have been quoted, in Facebook's own research, saying, we know you changed how you pick out the content that goes in the home feed," said Haugen. "And so it used to be that we did very little of it, and now we have to do a lot of it, because we have jobs to do. And now if we don't publish angry, hateful, polarizing, divisive content, crickets. We know our constituents don't like this. But if we don't do these stories, we don't get distributed."
