By B.N. Frank
Facebook (now Meta), its owners, and its employees have been accused of all kinds of terrible things. Sadly, it’s not the first time the company has been sued for “genocide complicity and endangering humanity.”
From Ars Technica:
“Defective design” —
Landmark $150B lawsuit seeks to hold Facebook accountable for Rohingya genocide
Facebook knew it was fanning flames of violence, did nothing, lawsuit alleges.
Rohingya refugees have filed a lawsuit against Meta, formerly known as Facebook, for its alleged role in the ethnic cleansing currently underway in Myanmar, sometimes known as Burma. The lawsuit says the social media giant is on the hook for “at least $150 billion” for “wrongful death, personal injury, pain and suffering, emotional distress, and loss of property.”
This lawsuit claims that Meta’s Facebook product is defective and that the company acted negligently. The complaint was filed this week in San Mateo County Superior Court, the jurisdiction in which Meta is headquartered, on behalf of a Rohingya refugee living in Illinois. It’s seeking class-action status to encompass all of the more than 10,000 Rohingya refugees who have resettled in the US since 2012.
The lawsuit is among the first to leverage allegations made by former Facebook employees and whistleblowers, including Frances Haugen, who shared over 10,000 documents with Congress and the Securities and Exchange Commission.
“Facebook executives were fully aware that posts ordering hits by the Myanmar government on the minority Muslim Rohingya were spreading wildly on Facebook,” one former Facebook employee said in a whistleblower complaint that was cited by the new lawsuit. “The issue of the Rohingya being targeted on Facebook was well known inside the company for years.”
“Textbook example of ethnic cleansing”
The Rohingya have been subjected to state-sanctioned harassment for years. A Muslim minority in a Buddhist-majority nation, the Rohingya people have been denied citizenship and falsely accused of being foreign invaders or criminals. In 2017, Myanmar’s military razed villages and carried out a campaign of rape and murder. Nearly 7,000 Rohingya people were killed, Doctors Without Borders estimated, and many of the 750,000 who fled now live in squalid conditions in refugee camps. The UN high commissioner for human rights called it “a textbook example of ethnic cleansing.”
The lawsuit claims that Facebook played a key role in stoking ethnic animus and facilitating the ruling junta’s ethnic cleansing.
“While the Rohingya have long been the victims of discrimination and persecution, the scope and violent nature of that persecution changed dramatically in the last decade, turning from human rights abuses and sporadic violence into terrorism and mass genocide,” the lawsuit says. “A key inflection point for that change was the introduction of Facebook into Burma in 2011, which materially contributed to the development and widespread dissemination of anti-Rohingya hate speech, misinformation, and incitement of violence.”
Facebook used its Internet.org Free Basics program to give people in Myanmar access to the Internet—provided they signed up for Facebook. Users received Internet niceties like weather and local news without incurring charges on their mobile phone plans, but they essentially lived in Facebook’s walled garden. For many, it was their first exposure to the Internet.
“Crisis of digital literacy”
“This resulted in a ‘crisis of digital literacy,’” the lawsuit says, “leaving these new users blind to the prevalence of false information online. Facebook did nothing, however, to warn its Burmese users about the dangers of misinformation and fake accounts on its system or take any steps to restrict its vicious spread.”
The ruling junta took advantage of that, the lawsuit alleges. “The brutal and repressive Myanmar military regime employed hundreds of people, some posing as celebrities, to operate fake Facebook accounts and to generate hateful and dehumanizing content about the Rohingya,” it says. “So deep was Facebook’s penetration into daily life in Burma and its role in the out-of-control spread of anti-Rohingya content, that Marzuki Darusman, chairman of the U.N. Independent International Fact-Finding Mission on Myanmar, described Facebook as having played a ‘determining role’ in the genocide.”
Facebook, the lawsuit claims, was well aware of the problems. “Despite having been repeatedly alerted between 2013 and 2017 to the vast quantities of anti-Rohingya hate speech and misinformation on its system, and the violent manifestation of that content against the Rohingya people, Facebook barely reacted and devoted scant resources to addressing the issue.”
It wasn’t until 2018, after a report from the UN documented the social network’s role in the violence, that the company took action, the lawsuit says. “We agree that we can and should do more,” said Alex Warofka, a product policy manager at Facebook.
Ignored warnings?
“They were warned so many times,” David Madden, a tech entrepreneur who worked in Myanmar, told Reuters in 2018. Madden said he gave a talk at Facebook’s headquarters in 2015 in which he warned the company that its site was being used to spread hatred. “It couldn’t have been presented to them more clearly, and they didn’t take the necessary steps,” he said.
A Meta spokesperson told Ars:
We’re appalled by the crimes committed against the Rohingya people in Myanmar. We’ve built a dedicated team of Burmese speakers, banned the Tatmadaw [the Burmese armed forces], disrupted networks manipulating public debate and taken action on harmful misinformation to help keep people safe. We’ve also invested in Burmese-language technology to reduce the prevalence of violating content. This work is guided by feedback from experts, civil society organizations and independent reports, including the UN Fact-Finding Mission on Myanmar’s findings and the independent Human Rights Impact Assessment we commissioned and released in 2018.
Defective product claim
The lawsuit alleges that Facebook is liable because it created a defective product, a relatively new legal argument that seeks to get around the company’s protections under Section 230. Facebook’s ranking algorithm, the lawsuit alleges, spurred violence against the Rohingya, “precisely the kind of harm that could have been reasonably expected from Facebook’s propagation and prioritization of anti-Rohingya hate speech and misinformation on its system.”
Furthermore, the lawsuit says that Facebook was negligent in its duty to “use reasonable care to avoid injuring others.” The company, the lawsuit alleges, “breached this duty by—among other things—negligently designing its algorithms to fill Burmese users’ News Feeds (especially users particularly susceptible to such content) with disproportionate amounts of hate speech, misinformation, and other content dangerous to Plaintiff and the Class.”
Activist Post reports regularly about Facebook and unsafe technology. For more information, visit our archives.