Facebook said on Thursday it had banned all remaining accounts linked to the Myanmar military, citing the junta’s use of deadly force against anti-coup demonstrators.
The move, which takes effect immediately, applies to the military and entities controlled by the armed forces on both Facebook and Instagram.
It also bans “military-linked commercial entities” from advertising on the platforms.
“Events since the February 1 coup, including deadly violence, have precipitated a need for this ban,” the social media giant said in a statement.
“We believe the risks of allowing the Tatmadaw on Facebook and Instagram are too great,” it added, using the Myanmar name for the country’s armed forces.
The junta has steadily increased its use of force against a massive and largely peaceful civil disobedience campaign demanding Myanmar’s army leaders relinquish power.
Three anti-coup protesters have been killed in demonstrations, while a man patrolling his Yangon neighbourhood against night arrests was also shot dead.
Facebook said its ban was intended to prevent Myanmar’s generals “from abusing our platform”.
The military has used Facebook to boost its claims that voter fraud marred an election last November after ousted civilian leader Aung San Suu Kyi’s party won in a landslide.
Since seizing power, the junta has arrested hundreds of anti-coup protesters, ordered nightly internet blackouts and banned social media platforms — including Facebook — in an effort to quell resistance.
Thursday’s announcement follows Facebook’s earlier decision to remove a page run by the regime’s “True News” information service after the tech giant accused it of inciting violence.
Pages for government offices now run by the junta remain unaffected.
“This ban does not cover government ministries and agencies engaged in the provision of essential public services,” the company said. “This includes the Ministry of Health and Sport, and the Ministry of Education.”
In recent years, hundreds of army-linked pages have been blocked by Facebook after the social media giant came under heavy criticism for its ineffective response to malicious posts in the country.
Junta chief Min Aung Hlaing and other top brass were booted from the platform in 2018, a year after a military-led crackdown forced around 750,000 members of the Rohingya Muslim community to flee into neighbouring Bangladesh.
Facebook admitted that year it had failed to do enough to prevent the incitement of violence in Myanmar.
“We can and should do more,” Facebook executive Alex Warofka said at the time.
United States federal and state antitrust enforcers filed suit against Facebook on Wednesday, claiming the social media giant abused its dominant position and seeking to unwind its acquisitions of the messaging services Instagram and WhatsApp.
The suits allege Facebook sought to squelch competition by acquiring the messaging applications — Instagram in 2012 and WhatsApp in 2014.
Separate suits filed by the Federal Trade Commission and a coalition of state officials called for the divestment of Instagram and WhatsApp, services which are part of the Facebook “family” of applications.
“Facebook’s actions to entrench and maintain its monopoly deny consumers the benefits of competition,” said Ian Conner, director of the FTC’s Bureau of Competition.
“Our aim is to roll back Facebook’s anticompetitive conduct and restore competition so that innovation and free competition can thrive.”
A separate legal action was filed by antitrust enforcers from 48 US states and territories.
“For nearly a decade, Facebook has used its dominance and monopoly power to crush smaller rivals and snuff out competition, all at the expense of everyday users,” said New York state Attorney General Letitia James, who leads the coalition.
The action presages a fierce court battle seeking to force Facebook to divest the apps which have become an increasingly important element of the business model of the California giant and integrated into its technology.
Facebook said it would “vigorously” defend its actions and denied abusing its position.
“Antitrust laws exist to protect consumers and promote innovation, not to punish successful businesses,” Facebook general counsel Jennifer Newstead said in a statement.
“Instagram and WhatsApp became the incredible products they are today because Facebook invested billions of dollars, and years of innovation and expertise, to develop new features and better experiences for the millions who enjoy those products.”
Newstead added that these deals had been approved years ago by the FTC, which she said meant “the government now wants a do-over, sending a chilling warning to American business that no sale is ever final.”
Consumer harm?

Some analysts argued the antitrust cases would have difficulty proving Facebook harmed consumers since its services are largely free.
Jessica Melugin of the Competitive Enterprise Institute libertarian think tank called the actions “political theater dressed up as antitrust law” and argued that “a billion consumers worldwide have benefited from Facebook’s purchase of Instagram and WhatsApp.”
Cleveland State University law professor Christopher Sagers said the case may have merit because Facebook “has been an unabashedly predatory and exclusionary bully in every sector it’s been involved in.”
But he also noted that “American antitrust law is now so hard to enforce in all cases, especially in cases like this, involving no conspiracy among competitors, and rather involving only one big firm’s unilateral conduct.”
Tiffany Li, a Boston University law professor who studies the sector, said that while Facebook has rivals bidding for internet users’ attention it has a big advantage because of its access to data.
“One company having exclusive ownership of vast amounts of user data, with no potential for interoperability or access to competitors, can be anti-competitive,” she said.
The FTC announced earlier this year it would review acquisitions made by five Big Tech firms over the past decade, opening the door to a wave of potential antitrust investigations.
The consumer protection agency said it would review deals made by Amazon, Apple, Facebook, Microsoft and Google parent Alphabet since 2010 amid growing complaints about tech platforms which have dominated key economic sectors.
The US Justice Department, which shares antitrust enforcement with the FTC, in October sued Google parent Alphabet, accusing the Silicon Valley giant of maintaining an “illegal monopoly” in online search and advertising and opening the door to a potential breakup. Eleven US states joined that case.
Facebook and other social media companies are being scrutinised over how they handle misinformation.
With just two months left until the United States presidential election, Facebook says it is taking more steps to encourage voting, minimise misinformation and reduce the likelihood of post-election “civil unrest”.
The company said on Thursday it will restrict new political advertisements in the week before the election and remove posts that convey misinformation about COVID-19 and voting.
It will also attach links to official results to posts from candidates and campaigns that declare premature victories.
“This election is not going to be business as usual. We all have a responsibility to protect our democracy,” Facebook CEO Mark Zuckerberg said in a post. “That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest.”
Facebook and other social media companies are being scrutinised over how they handle misinformation, given issues with President Donald Trump and other candidates posting false information and Russia’s continuing attempts to interfere in US politics.
Facebook has long been criticised for not fact-checking political ads or limiting how they can be targeted at small groups of people.
With the nation divided, and election results potentially taking days or weeks to be finalised, there could be an “increased risk of civil unrest across the country”, Zuckerberg said.
In July, Trump refused to publicly commit to accepting the results of the upcoming election, as he scoffed at polls that showed him lagging behind Democratic rival Joe Biden.
Trump has also made false claims that the increased use of mail-in voting because of the coronavirus pandemic allows for voter fraud. That has raised concerns over the willingness of Trump and his supporters to abide by election results.
Asked in an interview aired on Thursday whether he had personally engaged with Trump over his posts about voting, Zuckerberg said he did not think he had recently.
But Zuckerberg said he had had “certain discussions with him in the past where I’ve told him that I thought some of the rhetoric was problematic”.
Under the new measures, Facebook says it will prohibit politicians and campaigns from running new election advertisements in the week before the election. However, they can still run existing advertisements and change how they are targeted.
Posts with obvious misinformation on voting policies and the coronavirus pandemic will also be removed, the company said.
Users can only forward articles to a maximum of five others on Messenger, Facebook’s messaging app. The company will also work with the Reuters news agency to provide official election results and make the information available both on its platform and with push notifications.
After being caught off-guard by Russia’s efforts to interfere in the 2016 US presidential election, Facebook, Google, Twitter and other companies put safeguards in place to prevent it from happening again.
That includes taking down posts, groups and accounts that engage in “coordinated inauthentic behavior” and strengthening verification procedures for political advertisements.
Last year, Twitter banned political advertisements altogether, while Alphabet’s Google limited the ways in which election advertisers could micro-target voters.
Zuckerberg said Facebook had removed more than 100 networks worldwide engaging in such interference over the last few years.
“Just this week, we took down a network of 13 accounts and two pages that were trying to mislead Americans and amplify division,” he said.
But experts and Facebook’s own employees say the measures are not enough to stop the spread of misinformation – including from politicians and in the form of edited videos.
Facebook had previously drawn criticism for its advertisement policy, which cited freedom of expression as the reason for letting politicians like Trump post false information about voting.
Social media giant faces scrutiny after media reports revealed it ignored anti-Muslim hate speech by BJP leaders.
An Indian parliamentary committee has grilled Facebook representatives after the social media giant was accused of bias and not acting against anti-Muslim posts on its platform.
The closed-door hearing on Wednesday followed accusations in newspaper reports that the social media giant was allowing hate speech on its platform and that its top policy official in India had shown favouritism towards Prime Minister Narendra Modi’s Bharatiya Janata Party (BJP).
The social media giant has denied the allegations and the outcome of the hearing was unclear.
Facebook came under scrutiny after a series of reports by the United States-based Wall Street Journal (WSJ) showed the company ignored anti-Muslim hate speeches by BJP leaders while Facebook’s India policy chief, Ankhi Das, made decisions favouring Modi.
On Tuesday, New Delhi-based English daily the Indian Express reported that following a request from the party, Facebook removed pages critical of the BJP months before the 2019 general elections.
In email exchanges reported by the Express, the BJP had told Facebook the pages were “in violation of expected standards”, with posts that were “not in line with facts”.
Facebook did not respond to requests for comment.
India is Facebook’s biggest market with more than 300 million users while the company’s messaging app, WhatsApp, boasts 400 million users in the world’s second-most populous nation.
The BJP spends more than any political party in India on Facebook advertisements.
Dozens of Muslims have been lynched in the past six years by vigilantes, with many of the incidents triggered by fake news regarding cow slaughter or smuggling shared on WhatsApp.
The WSJ had reported last month that Das refused to apply the company’s hate speech policies to BJP politicians and other “Hindu nationalist individuals and groups”.
Facebook allowed anti-Muslim posts on its platform to avoid ruining the company’s relationship with the BJP, the WSJ said. Time Magazine made similar allegations last week.
Das last month apologised to Muslim staff for sharing a post that dubbed Muslims in India a “degenerate community”, according to a report by US media outlet BuzzFeed News.
Opposition attacks Facebook

The Facebook deposition was originally slated for Tuesday but was deferred following the death of former Indian President Pranab Mukherjee.
The opposition Congress party said in a statement on Tuesday that there was a “blasphemous nexus between the BJP and Facebook”.
“The aim of the BJP is ‘divide and rule’ and the social media giant Facebook is helping them achieve this,” it said in the statement.
Opposition parliamentarian Derek O’Brien, in a letter sent to Facebook CEO Mark Zuckerberg on Tuesday, also said there was “enough material in the public domain, including memos of senior Facebook management (in India)” to show bias favouring the BJP.
Meanwhile, senior BJP leader and India’s communications minister Ravi Shankar Prasad claimed in a letter he sent to Zuckerberg that ahead of the 2019 national elections, “there was a concerted effort by Facebook … to not just delete pages or substantially reduce their reach but also offer no recourse or right of appeal to affected people who are supportive of right-of-centre ideology”.
Prasad also alleged in the letter that recent press reports were the result of “selective leaks … to portray an alternate reality”.
“This interference in India’s political process through gossip, whispers and innuendo is condemnable,” Prasad said.
Ajit Mohan, Facebook’s India chief, has defended the company’s actions and denied any bias. But the company also admitted it had to do better on tackling hate speech.
Right-wing bias?

The alleged favouritism towards India’s Hindu nationalists is not the first time the social media giant has been accused of tacitly supporting right-wing groups.
Last year, campaign group Avaaz said that the tech giant was failing to rein in a “tsunami” of hate posts inflaming ethnic tensions in India’s northeast state of Assam.
Avaaz said the dehumanising language – often targeting India’s Bengali Muslims – was similar to that used on Facebook about Myanmar’s mainly Muslim Rohingya before an army crackdown and ethnic violence forced 700,000 Rohingya to flee in 2017 to Bangladesh.
The platform has also come under fire in Myanmar over hate speech directed against the Rohingya over the past decade.
Investigators from the United Nations said Facebook played a key role in spreading hate speech that fuelled the violence.
The company admitted two years ago that it had been “too slow” to address the problem.
Also last month in the US, a Facebook engineer was reportedly fired over internal posts revealing that right-leaning groups and individuals were given preferential treatment, with their posts shielded from removal despite violating content rules.
Far-right news website Breitbart, non-profit group PragerU and Trump supporters Diamond and Silk were among the organisations and personalities favoured by Facebook, according to internal posts seen by BuzzFeed.
Platform, accused of fanning hate towards Rohingya, says it removed 280,000 posts that breached rules in second quarter.
Facebook said on Tuesday that it was preparing for Myanmar’s general election in November by improving the detection and removal of hate speech and content that incites violence, and preventing the spread of misinformation.
The company said in a blog that between now and November 22, it would remove “verifiable misinformation and unverifiable rumours” that are assessed as having the potential to suppress the vote or damage the “integrity” of the electoral process.
“For example, we would remove posts falsely claiming a candidate is a Bengali, not a Myanmar citizen, and thus ineligible,” Facebook said.
The platform has come under fire in Myanmar over hate speech directed against the mainly Muslim Rohingya over the past decade, including during the brutal military-led crackdown in 2017 that forced more than 730,000 Rohingya to flee the country. Investigators from the United Nations said Facebook played a key role in spreading hate speech that fuelled the violence.
The company admitted two years ago that it had been “too slow” to address the problem.
Facebook said it was working with two partners in Myanmar to verify the official Facebook pages of political parties. It now has three fact-checking partners in Myanmar: BOOM, AFP Fact Check and Fact Crescendo.
Violating standards

It also said it had introduced a new feature that limits to five the number of times a message can be forwarded.
The feature is now available in Myanmar and, over the next few weeks, will be made available to Messenger users worldwide, the company added in the blog.
This year’s elections in Myanmar, scheduled for November 8, will be the second since the generals who had led the country for decades ceded power while ensuring their continuing influence through a 25-percent quota of seats in parliament.
The first, in 2015, brought longtime pro-democracy campaigner Aung San Suu Kyi’s National League for Democracy (NLD) to power but, during the two-month campaign period, an estimated one million Rohingya were stripped of their right to vote.
Facebook and other social media platforms have faced criticism worldwide in recent years from activists, regulators and governments for the spread of misinformation, including during elections.
Australian government wants the social media giant to pay for local publishers’ content amid pressure from Murdoch.
Facebook Inc. plans to block people and publishers in Australia from sharing news, a move that pushes back against a proposed law forcing the company to pay media firms for their articles.
The threat escalates an antitrust battle between Facebook and the Australian government, which wants the social-media giant and Alphabet Inc.’s Google to compensate publishers for the value they provide to their platforms.
The legislation still needs to be approved by Australia’s parliament. Under the proposal, an arbitration panel would decide how much the technology companies must pay publishers if the two sides can’t agree.
Facebook said in a blog posting Monday that the proposal is unfair and would allow publishers to charge any price they want. If the legislation becomes law, the company says it will take the unprecedented step of preventing Australians from sharing news on Facebook and Instagram.
“This is a decision we’re making reluctantly,” said Campbell Brown, Facebook’s vice president of global news partnerships. “It is the only way to protect against an outcome that will hurt, not help Australia’s media outlets.”
Facebook is still working through the details of how it would block articles from being shared, she said.
‘Heavy-Handed Threats’

Responding to Facebook’s announcement, Australia Treasurer Josh Frydenberg said: “We don’t respond to coercion or heavy-handed threats wherever they come from.” Forcing digital platforms to pay for original content would help create “a more sustainable media landscape,” Frydenberg said in a statement.
The chairman of Australia’s competition regulator, Rod Sims, said Facebook’s threat was “ill-timed and misconceived.” The proposed legislation seeks to bring “fairness and transparency” to Facebook and Google’s relationships with Australian news businesses, Sims said in a statement.
Google has also raised alarms about Australia’s proposal. The measure “would force us to provide you with a dramatically worse” Google Search and YouTube, and “put the free services you use at risk in Australia,” Mel Silva, managing director of Google Australia and New Zealand, wrote in an open letter.
Media’s Struggles

The Australian government has said it’s trying to level the playing field between the tech giants and a local media industry that’s struggling from the loss of advertising revenue to those companies. In May, for example, Rupert Murdoch’s News Corp. announced plans to cut jobs and close or stop printing more than 100 local and regional newspapers in Australia.
The Australian-born Murdoch has for years advocated that Facebook and Google pay for news articles that appear on their platforms. And News Corp. has lauded government efforts to force the two companies to pay for news.
Michael Miller, executive chairman of News Corp Australasia, was quoted widely as saying: “The tech platforms’ days of free-riding on other peoples’ content are ending. They derive immense benefit from using news content created by others and it is time for them to stop denying this fundamental truth.”
Yet Facebook’s decision to block news on its platform could prevent publishers from reaching a wider audience. In the first five months of 2020, the company said it sent 2.3 billion clicks from its News Feed to Australian news websites.
‘Insignificant’ Loss

The decision could also limit the appeal of Facebook’s social-media platform to Australians who use it to read news. However, Brown said removing news articles from Facebook in Australia would be “insignificant” to its business because they are a small fraction of what users see.
Australia’s new rules are part of a global push by government agencies to regulate the tech giants. In some countries, officials are concerned not only that Facebook and Google are capturing much of the advertising dollars that have sustained journalism, but also with the types of articles getting shared. The stories that tend to go viral on Facebook are those that stoke emotion and divisiveness, critics argue.
In April, France’s antitrust regulator ordered Google to pay media companies to display snippets of articles. In June, Google said it would pay some media outlets that will be featured in a yet-to-be-released news service in Germany, Australia and Brazil.
Last October, Facebook introduced a separate news section, paying some publishers whose stories are featured. Brown declined to share numbers on the popularity of the Facebook News tab, but said nearly all of the readers are a new audience for publishers. Last week, Facebook said it plans to expand the news section to other markets globally.
Facebook on Saturday condemned what it called an “extreme” ruling by a Brazilian Supreme Court judge ordering it to block the accounts of 12 high-profile allies of President Jair Bolsonaro, a decision it vowed to appeal.
Brazil’s Supreme Court is overseeing an investigation into allegations that members of the far-right president’s inner circle ran a social media campaign to discredit the court, as well as slander and threaten its judges.
As part of that probe, Justice Alexandre de Moraes ordered Facebook to suspend the accounts of 12 Bolsonaro allies, and Twitter another 16 accounts.
The US social media giants complied on July 25 — but initially only blocked visitors in Brazil from viewing the accounts.
The blocked users soon skirted the ban by telling their followers how to change their account settings to another country.
Moraes then ordered the US social media giants Thursday to enforce the suspension worldwide.
When Facebook did not initially comply, saying it would appeal to the full Supreme Court, Moraes fined the company 1.9 million reais ($365,000) and issued a summons for its top executive in Brazil, Conrado Leister.
“This new legal order is extreme, posing a threat to freedom of expression outside of Brazil’s jurisdiction and conflicting with laws and jurisdictions worldwide,” Facebook said in a statement.
“Given the threat of criminal liability to a local employee, at this point we see no other alternative than complying with the decision by blocking the accounts globally, while we appeal to the Supreme Court.”
The row comes as Facebook and Twitter face increasing pressure in the United States and around the world to act more aggressively against hate speech and false information on their platforms.
In Brazil, it is part of ongoing tension between Bolsonaro and the high court, which has also ordered a probe into allegations the president obstructed justice to protect members of his inner circle from police investigations.
The affected accounts include high-profile figures such as conservative former lawmaker Roberto Jefferson, business magnate Luciano Hang and far-right activist Sara Winter.
Mark Zuckerberg announces a policy review after backlash over Facebook’s decision not to moderate controversial messages posted by Trump.
Mark Zuckerberg, chief executive of Facebook, promised to review the social network’s policies after its decision not to moderate controversial messages posted by the US president that appeared to encourage violence against those protesting police racism.
Protests continue over police brutality as several United States cities hold memorials to honour George Floyd, an unarmed Black man who was killed by police in Minneapolis, Minnesota, on May 25.
California’s governor has ordered the state police training programme to stop teaching officers how to use a hold that can block the flow of blood to the brain.
Seattle’s mayor has banned the city’s police force from using tear gas on protests.
Facebook chief Mark Zuckerberg has defended his decision not to condemn remarks by US President Donald Trump appearing to threaten violent retribution against demonstrators protesting the death of a Black man at the hands of police, media reported on Tuesday.
In a tweet that was also visible on his Facebook page, Trump slammed demonstrators protesting the death of George Floyd in Minneapolis as “THUGS” and appeared to promote a violent response by saying, “when the looting starts, the shooting starts.”
That phrase was first used by a Miami police chief in 1967 to justify a violent crackdown on black neighborhoods.
Twitter placed a “public interest notice” on the post for violating the platform’s rules “about glorifying violence,” a move not mirrored by Facebook.
Online magazine The Verge reported that Zuckerberg held a long conference call with Facebook employees and addressed accusations that the social media platform allowed election misinformation and veiled promotions of violence from Trump.
Zuckerberg told employees he should have offered them more transparency, The Verge reported citing a recording of the meeting.
But he stood by what he termed a “pretty thorough” evaluation of Trump’s posts, saying the choice to avoid labeling or removing them was difficult but correct.
According to the recording, Zuckerberg described being upset by some of Trump’s recent posts, including the one regarding looting.
“(But) I knew that I needed to separate out my personal opinion … from what our policy is and the principles of the platform we’re running are.”
Several Facebook employees have resigned over the lack of action.
One said publicly that the company would end up “on the wrong side of history.”
Mark Zuckerberg has revealed plans to use his platform to combat racial injustice as he spoke about racism and police brutality.
The Facebook founder made this known as he mourned the killings not just of George Floyd, but also of Breonna Taylor, Ahmaud Arbery and others, and disclosed the efforts he is making to use his platform to combat racial injustice.
He condemned police brutality, saying it “reminds us how far our country has to go to give every person the freedom to live with dignity and peace.” He wrote:
“The pain of the last week reminds us how far our country has to go to give every person the freedom to live with dignity and peace. It reminds us yet again that the violence Black people in America live with today is part of a long history of racism and injustice. We all have the responsibility to create change.
“We stand with the Black community — and all those working towards justice in honor of George Floyd, Breonna Taylor, Ahmaud Arbery and far too many others whose names will not be forgotten.
“To help in this fight, I know Facebook needs to do more to support equality and safety for the Black community through our platforms. As hard as it was to watch, I’m grateful that Darnella Frazier posted on Facebook her video of George Floyd’s murder because we all needed to see that. We need to know George Floyd’s name. But it’s clear Facebook also has more work to do to keep people safe and ensure our systems don’t amplify bias.
“The organizations fighting for justice also need funding, so Facebook is committing an additional $10 million to groups working on racial justice. We’re working with our civil rights advisors and our employees to identify organizations locally and nationally that could most effectively use this right now.
“I know that $10 million can’t fix this. It needs sustained, long term effort. One of the areas Priscilla and I have personally worked on and where racism and racial disparities are most profound is in the criminal justice system. I haven’t talked much about our work on this, but the Chan Zuckerberg Initiative has been one of the largest funders, investing ~$40 million annually for several years in organizations working to overcome racial injustice. Priscilla and I are committed to this work, and we expect to be in this fight for many years to come. This week has made it clear how much more there is to do.
“I hope that as a country we can come together to understand all of the work that is still ahead and do what it takes to deliver justice — not just for families and communities that are grieving now, but for everyone who carries the burden of inequality.”