A former Facebook employee, Frances Haugen, has blown the whistle on some of the company’s procedures, sending information to legislators, regulators, and The Wall Street Journal. Haugen believes the company doesn’t do enough to limit certain kinds of content on its platform. Jeff Horwitz reports:

Ms. Haugen was initially asked to build tools to study the potentially malicious targeting of information at specific communities. Her team, comprising her and four other new hires, was given three months to build a system to detect the practice, a schedule she considered implausible. She didn’t succeed, and received a poor initial review, she said. She recalled a senior manager telling her that people at Facebook accomplish what needs to be done with far fewer resources than anyone would think possible.

Around her, she saw small bands of employees confronting large problems. The core team responsible for detecting and combating human exploitation—which covered slavery, forced prostitution and organ selling—included just a few investigators, she said.

“I would ask why more people weren’t being hired,” she said. “Facebook acted like it was powerless to staff these teams.”

Mr. Stone of Facebook said, “We’ve invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority.”

Ms. Haugen said the company seemed unwilling to accept initiatives to improve safety if that would make it harder to attract and engage users, a stance that discouraged her and other employees.

“What did we do? We built a giant machine that optimizes for engagement, whether or not it is real,” read a presentation from the Connections Integrity team, an umbrella group tasked with “shaping a healthy public content ecosystem,” in the fall of 2019. The presentation described viral misinformation and societal violence as among the results.

Ms. Haugen came to see herself and the Civic Integrity team as an understaffed cleanup crew.

She worried about the dangers that Facebook might pose in societies gaining access to the internet for the first time, she said, and saw Myanmar’s social-media-fueled genocide as a template, not a fluke.

She talked about her concerns with her mother, a priest, who advised her that if she thought lives were on the line, she should do what she could to save those lives.

Facebook’s Mr. Stone said that the company’s goal was to provide a safe, positive experience for its billions of users. “Hosting hateful or harmful content is bad for our community, bad for advertisers, and ultimately, bad for our business,” he said.

On Dec. 2, 2020, the founder and chief of the team, Samidh Chakrabarti, called an all-hands teleconference meeting. From her San Francisco apartment, Ms. Haugen listened to him announce that Facebook was dissolving the team and shuffling its members into other parts of the company’s integrity division, the broader group tasked with improving the quality and trustworthiness of the platform’s content.

Mr. Chakrabarti praised what the team had accomplished “at the expense of our family, our friends and our health,” according to Ms. Haugen and another person at the talk. He announced he was taking a leave of absence to recharge, but urged his staff to fight on and to express themselves “constructively and respectfully” when they see Facebook at risk of putting short-term interests above the long-term needs of the community. Mr. Chakrabarti resigned in August. He didn’t respond to requests for comment.

That evening after the meeting, Ms. Haugen sent an encrypted text to a Journal reporter who had contacted her weeks earlier. Given her work on a team that focused in part on counterespionage, she was especially cautious and asked him to prove who he was.

The U.S. Capitol riot came weeks later, and she said she was dismayed when Facebook publicly played down its connection to the violence despite widespread internal concern that its platforms were enabling dangerous social movements.

Mr. Stone of Facebook called any implication that the company caused the riot absurd, noting the role of public figures in encouraging it. “We have a long track record of effective cooperation with law enforcement, including the agencies responsible for addressing threats of domestic terrorism,” he said.

In March, Ms. Haugen left the Bay Area to take up residence in Puerto Rico, expecting to continue working for Facebook remotely.

Ms. Haugen had expected that little would be left on Facebook Workplace that hadn’t already been written about or hidden away. Workplace is a regular source of leaks, and for years the company has been tightening access to sensitive material.

To her surprise, she found that attorney-client-privileged documents were posted in open forums. So were presentations to Chief Executive Mark Zuckerberg, sometimes in draft form, with notes from top company executives included.

Virtually any of Facebook’s more than 60,000 employees could have accessed the same documents, she said.

To guide her review, Ms. Haugen said she traced the careers of colleagues she admired, tracking their experiments, research notes and proposed interventions. Often the work ended in frustrated “badge posts,” goodbye notes that included denunciations of Facebook’s failure to take responsibility for harms it caused, she said. The researchers’ career arcs became a framework for the material that would ultimately be provided to the SEC, members of Congress and the Journal.

The more she read, she said, the more she wondered if it was even possible to build automated recommendation systems safely, an unpleasant thought for someone whose career focused on designing them. “I have a lot of compassion for people spending their lives working on these things,” she said. “Imagine finding out your product is harming people—it’d make you unable to see and correct those errors.”

The move to Puerto Rico brought her stint at Facebook to a close sooner than she had planned. Ms. Haugen said Facebook’s human resources department told her it couldn’t accommodate anyone relocating to a U.S. territory. In mid-April, she agreed to resign the following month.

Ms. Haugen continued gathering material from inside Facebook through her last hour with access to the system. She reached out to lawyers at Whistleblower Aid, a Washington, D.C., nonprofit that represents people reporting corporate and government misbehavior.

In addition to her coming Senate testimony and her SEC whistleblower claim, she said she’s interested in cooperating with state attorneys general and European regulators. While some have called for Facebook to be broken up or stripped of content liability protections, she disagrees. Neither approach would resolve the problems uncovered in the documents, she said—that despite numerous initiatives, Facebook didn’t address or make public what it knew about its platforms’ ill effects.

Mr. Stone of Facebook said, “We have a strong track record of using our research—as well as external research and close collaboration with experts and organizations—to inform changes to our apps.”

In Ms. Haugen’s view, allowing outsiders to see the company’s research and operations is essential. She also argues for a radical simplification of Facebook’s systems and for limits on promoting content based on levels of engagement, a core feature of Facebook’s recommendation systems. The company’s own research has found that “misinformation, toxicity, and violent content are inordinately prevalent” in material reshared by users and promoted by the company’s own mechanics.

“As long as your goal is creating more engagement, optimizing for likes, reshares and comments, you’re going to continue prioritizing polarizing, hateful content,” she said.
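The dynamic Ms. Haugen describes can be sketched in a few lines of code. The snippet below is a hypothetical illustration, not Facebook’s actual ranking code; the signal names and weights are invented for the example, which assumes a feed ranked purely by predicted likes, reshares and comments.

```python
from dataclasses import dataclass

@dataclass
class Post:
    """A piece of content with predicted engagement signals (all hypothetical)."""
    text: str
    p_like: float     # predicted probability the viewer likes the post
    p_reshare: float  # predicted probability the viewer reshares it
    p_comment: float  # predicted probability the viewer comments on it

# Invented weights: reshares and comments count for more than likes,
# mirroring the engagement-based ranking the article describes.
WEIGHTS = {"like": 1.0, "reshare": 30.0, "comment": 15.0}

def engagement_score(post: Post) -> float:
    """Score a post purely by expected engagement, with no quality or safety term."""
    return (WEIGHTS["like"] * post.p_like
            + WEIGHTS["reshare"] * post.p_reshare
            + WEIGHTS["comment"] * post.p_comment)

def rank_feed(posts: list[Post]) -> list[Post]:
    """Order a feed by engagement score alone, highest first."""
    return sorted(posts, key=engagement_score, reverse=True)
```

Because provocative material tends to earn higher predicted reshare and comment probabilities, a ranker like this will surface it first regardless of accuracy or harm. That is the mechanism behind Ms. Haugen’s argument for limiting promotion based on engagement alone.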

Read more here.