In October, a former Facebook product manager named Frances Haugen testified before Congress and several European governmental bodies that the company was too large to monitor dangerous activity on its platforms and that it monopolizes the data of billions of people. Just days after her testimony, Facebook, the world’s largest social media platform, rebranded itself as Meta. While the move may seem to have come out of left field, the UNC Hussman School of Journalism and Media’s Shannon McGregor and Daniel Kreiss, who study misinformation on social media and how technology shapes the public sphere, were not surprised.
Disillusioned after two years with Facebook, Haugen filed complaints with the Securities and Exchange Commission alleging that Facebook hid damaging internal research that included criminal activity being conducted through the platform. She then leaked documents to the Wall Street Journal showing that Facebook was aware of the negative effects of misinformation shared on the platform and its subsidiaries. She testified before Congress several times in October and November to contextualize those documents and suggest solutions for reform, including new leadership and drastic changes to the algorithms the company uses to order and present content to users.
Days later, on Oct. 28, Facebook CEO, chairman and controlling shareholder Mark Zuckerberg announced the company was rebranding and investing $10 billion over the next 10 years to bring every Facebook user’s life into virtual reality.
Will Facebook invest in people and systems to monitor its new “metaverse,” or will it repeat the same mistakes Haugen exposed? This is one of many questions raised by the company’s move, say Assistant Professor McGregor and Kreiss, Edgar Thomas Cato Distinguished Professor.
Is Facebook too big?
The breadth of Meta’s holdings should concern lawmakers and the public, according to Haugen. The company owns the four most-downloaded applications in the last decade — Facebook, Facebook Messenger, Instagram and WhatsApp — as well as the virtual reality company Oculus and Giphy, a GIF and animated sticker company. And it collects the data of billions of users, which gives Meta unprecedented access to the personal information and online behavior of around a third of the world’s population.
McGregor says her biggest concern about the Meta expansion is how it will generate profits from selling user data and from targeted advertising based on that data.
“Facebook is still going to have access to years of billions of users’ data, and they will find a way to monetize that in the metaverse, whether it’s targeted sales of their technologies or special experiences and offers from commercial businesses,” McGregor says.
McGregor says that Meta’s growth strategy, which included buying up its competitors and, by extension, their users’ data, also reflects a lack of concern for the social ramifications of the platform.
“Facebook has pursued two strategies for expansion: The first is growth at all costs, which means they want to push into as many new markets as possible as quickly as possible. And the second strategy is monetizing new users through their data,” McGregor says. “What’s clear from the whistleblower’s testimony is that growth far outstripped Facebook’s ability to monitor its platforms.”
These growing pains had real-world consequences: The platform lacked the workforce and language competency to accurately monitor posts in other countries for dangerous content or misinformation, as evidenced by the Wall Street Journal’s reporting on the documents Haugen provided.
The news outlet revealed that the rioters who attacked the U.S. Capitol building on Jan. 6 used Facebook and other methods to organize the gathering and that misinformation and violent threats were widespread and unflagged on the site.
Mexican drug cartels used Facebook to recruit hit men and then posted videos of executions as threats to those who opposed them.
Human traffickers in the Middle East used specific hashtags on Facebook and Instagram to advertise and post fake job listings to trick women into being trafficked under the premise of legitimate work.
“The core problem isn’t that Facebook allows targeted advertising or free speech,” says Kreiss. “It’s that politicians, fringe groups and criminals use Facebook to create content that spreads disinformation, or even illegal activity, and Facebook doesn’t have systems in place to differentiate between a regular life update and a black-market listing.”
The platform’s critics argue that Facebook should be regulated because of this rampant misuse and alleged antitrust violations, but so far, no such effort has succeeded.
Regulating a giant
U.S. government officials and private companies have repeatedly accused Facebook of antitrust violations, including a Nov. 4 lawsuit filed by a now-defunct start-up company called Phhhoto. The lawsuit alleges that Facebook began negotiations to enter a partnership with Phhhoto, then prolonged the deal long enough to copy features from Phhhoto’s application and manipulate search tools to ensure Phhhoto would not appear in the top search results. The lawsuit also alleges that many features that originated on Phhhoto are strikingly similar to current Instagram filters.
On Nov. 29, the United Kingdom’s Competition and Markets Authority ordered Facebook to sell Giphy after the regulatory body found that Facebook’s control of the company could impact social media competitors such as Snapchat or TikTok if they were denied access to Giphy’s millions of digital assets.
In the last year, the Federal Trade Commission and a coalition of attorneys general from nearly every state each sued Facebook for antitrust violations, but both cases were dismissed.
These lawsuits are attempts to rein in Facebook using antitrust laws, which prohibit companies from buying rivals for the express purpose of destroying competition. Facebook’s critics argue that the company acquired other tech companies to monopolize the market, not to improve its portfolio. After these failed lawsuits, many are looking to the federal government to take on Facebook.
“One man is making decisions that impact billions of people, which supports the argument that Facebook holds a monopoly on social platforms,” says Kreiss. “The prospect of regulation from Congress is realistic but will likely be impeded by partisanship.”
Kreiss says members of the Democratic Party tend to view disinformation and hate speech as Facebook’s core problem, while Republicans tend to believe the problem is censorship and limiting free expression. The one area of consensus, he says, is the conclusion that unrestricted social media can be damaging to children, which might provide common ground for bipartisan legislation.
“Historically, legislators convene around the idea of protecting minors from harm, and the courts tend to grant greater latitude in those cases,” says Kreiss. “Whether or not the size of Facebook’s holdings would be addressed in that kind of legislation remains to be seen.”
Arguments against government regulation of Meta center on the difficulty of defining what “too big” means for social media. Currently, Facebook outranks other social media platforms, including TikTok and Snapchat, by more than a billion monthly active users, but those applications still have hundreds of millions of users. Facebook’s acquisition of Instagram and WhatsApp removed two of its closest rivals from the race for users, leaving only YouTube in close competition.
“We also need to consider if breaking up Facebook and its holdings into smaller units would have any effect or even result in greater misinformation,” says Kreiss. “As a whole, Facebook has enormous resources it can devote to fixing its problems, but if it’s split, they could argue they no longer have the bandwidth to monitor their users.”
McGregor points to another argument for government regulation of Meta: The company has become critical infrastructure in some countries. Twice in the last two months, Facebook, Instagram and WhatsApp crashed simultaneously for hours at a time. In countries like Afghanistan and Myanmar, Facebook is the only free portal to the rest of the internet, and billions of users worldwide rely on WhatsApp to communicate affordably and send censorship-free messages.
“Facebook is a global company whose decisions affect the lives of billions of people in very palpable ways,” says McGregor. “When they positioned themselves as the internet provider for millions, they opened themselves up to the possibility of regulation.”
Notably, Haugen addressed the outages in her testimony on Oct. 4, saying, “Yesterday we saw Facebook taken off the internet. I don’t know why it went down, but I know that for more than five hours, Facebook wasn’t used to deepen divides, destabilize democracies and make young girls and women feel bad about their bodies.”
Into the “metaverse”
The company’s new name reflects its ambition to move beyond social media and into a virtual world.
The metaverse will include virtual and augmented reality and video components that allow users to “live” within a digital universe. It’s envisioned as a platform for working, socializing, playing and traveling, all virtually.
McGregor says that Facebook has been quietly laying the groundwork for Meta for quite some time.
“Over the past several years, Facebook has ramped up its investments in hardware, including developing a line of Portal video-calling devices, releasing Oculus virtual reality headsets and launching Ray-Ban Stories, a type of smart glasses that capture photos and videos from the wearer’s perspective,” says McGregor. “It’s not a coincidence that Facebook is moving into the virtual reality space during a pandemic that made us homebound for months at a time.”
Kreiss says the announcement was likely timed to coincide with Haugen’s whistleblower testimony in an attempt to change the narrative around the company.
“The question is, can Facebook, or more aptly Meta, successfully rebrand and renew trust with their users and society?” asks Kreiss.