Tag Archives: Facebook

Facebook Might be Considering a Ban on Political Ads



Bloomberg reported that Facebook is considering imposing a ban on political ads on its social network in the days leading up to the November election, citing “people familiar with the company’s thinking.” I find this surprising.

Recently, an independent audit found Facebook lacking in several important areas. The summary of the audit pointed out many ways that Facebook needed to improve. Here are just a few points from the summary:

  •  The Auditors noted that Facebook’s definition of protecting free speech meant allowing harmful and divisive rhetoric that amplifies hate speech and threatens civil rights. “When it means that powerful politicians do not have to abide by the same rules as everyone else does, a hierarchy of speech is created that privileges certain voices over less powerful voices.”
  • The auditors have “grave concerns that the combination of the company’s decision to exempt politicians from fact-checking and the precedents set by its recent decisions on President Trump’s posts, leaves the door open for the platform to be used by other politicians to interfere with voting. If politicians are free to mislead people about official voting methods (by labeling ballots illegal or making other misleading statements that go unchecked for example) and are allowed to use not-so-subtle dog whistles with impunity to incite violence against groups advocating for racial justice, this does not bode well for the hostile voting environment that can be facilitated by Facebook in the United States.”

I want to believe that Facebook read the audit and came away realizing that refusing to fact-check politicians’ posts, and allowing politicians to break Facebook’s rules, was a bad idea. The company has imposed political ad blackouts before elections in other countries, but this would be the first time (to my knowledge) that it has considered doing so ahead of the U.S. 2020 election.

One thing to keep in mind is that Bloomberg is not reporting that this ad blackout is going to happen. It appears to be something Facebook is considering. That doesn’t mean it will become policy.

The other thing to consider is that, as Bloomberg points out, political advertising is a very small part of Facebook’s business. It is possible that Facebook is considering this small change, for a very limited time span, so it can appear to have done something in response to the audit. Banning political ads will not undo the damage already done by politicians who post misinformation.


Facebook Failed its Civil Rights Audit



An independent audit of Facebook’s policies and practices was led by Laura W. Murphy, a civil rights and civil liberties leader, together with a team from the civil rights law firm Relman Colfax headed by Megan Cacace.

The audit, which began in 2018 at the behest and encouragement of the civil rights community and some members of Congress, proceeded with Facebook’s cooperation. Its purpose was to help Facebook identify, prioritize, and implement sustained and comprehensive improvements to the way it impacts civil rights.

At the start, the audit was to focus on voter suppression and voter information, building a civil rights accountability infrastructure, content moderation and enforcement (including hate speech and harassment), advertising targeting and practices, diversity and inclusion, fairness in algorithms, and the civil rights implications of privacy practices. The auditors later added COVID-19 and the 2020 census as topics.

Here are some areas where Facebook failed its audit:

  •  In September of 2019, Facebook’s Vice President of Global Affairs and Communications, Nick Clegg, said that Facebook would continue to exempt politicians from its third-party fact-checking program. He also announced that the company had a standing policy of treating speech from politicians as newsworthy speech that should be seen and heard, and not interfered with by Facebook, unless outweighed by the risk of harm.
  •  In October of 2019, Mark Zuckerberg gave a speech at Georgetown University in which he reiterated his prioritization of free expression as a governing principle of Facebook. In the speech, he doubled down on the company’s treatment of politicians’ speech.
  •  The Auditors noted that Facebook’s definition of protecting free speech meant allowing harmful and divisive rhetoric that amplifies hate speech and threatens civil rights. “When it means that powerful politicians do not have to abide by the same rules as everyone else does, a hierarchy of speech is created that privileges certain voices over less powerful voices.”
  •  The audit summary points out that Facebook “has no qualms about reining in speech by the proponents of the anti-vaccination movement, or limiting misinformation about COVID-19, but when it comes to voting, Facebook has been far too reluctant to adopt strong rules to limit misinformation and voter suppression.”
  •  The summary also says: “Facebook’s failure to remove the Trump voting-related posts and close enforcement gaps seems to reflect a statement of values that protecting free expression is more important than other stated company values.”

The auditors have “grave concerns that the combination of the company’s decision to exempt politicians from fact-checking and the precedents set by its recent decisions on President Trump’s posts, leaves the door open for the platform to be used by other politicians to interfere with voting. If politicians are free to mislead people about official voting methods (by labeling ballots illegal or making other misleading statements that go unchecked for example) and are allowed to use not-so-subtle dog whistles with impunity to incite violence against groups advocating for racial justice, this does not bode well for the hostile voting environment that can be facilitated by Facebook in the United States.”


Facebook Removed Networks of Inauthentic Accounts



Facebook announced that it removed four separate networks for violating Facebook’s policy against foreign interference and coordinated inauthentic behavior.

Facebook removed 54 Facebook accounts, 50 Pages, and 4 Instagram accounts that were involved in coordinated inauthentic behavior in the United States. This network focused on domestic audiences.

The network used fake accounts to pose as residents of Florida, posted and commented on its own content to make it appear more popular than it was, and attempted to evade enforcement. Facebook said that several of these accounts had links to the Proud Boys, which Facebook describes as “a hate group we banned in 2018.”

The Page admins and account owners posted about local politics in Florida; Roger Stone and his Pages, websites, books, and media appearances; a Florida land and water resources bill; the hacked materials released by WikiLeaks ahead of the 2016 US election; candidates in the 2016 primaries and general election; and the Roger Stone trial.

Facebook also removed 41 Facebook accounts, 77 Pages, and 56 Instagram accounts that originated in Canada and Ecuador and focused on El Salvador, Argentina, Uruguay, Venezuela, Ecuador, and Chile.

This network relied on a combination of authentic and inauthentic accounts. It activated around civic events such as elections, at times posting on both sides of the political debate before abandoning or pausing its activity. It used fake accounts to drive people to off-platform sites and managed Pages posing as independent news outlets in the countries it targeted.

Facebook’s investigation of this group found links to political consultants and former government employees in Ecuador, as well as to Estraterra, a Canada-based PR firm.

Facebook removed 35 Facebook accounts, 14 Pages, 1 Group and 38 Instagram accounts that originated in Brazil and focused on domestic audiences. This network created fictitious personas posing as reporters, and managed Pages masquerading as news outlets. They posted about elections, political memes, and criticism of the political opposition.

Facebook’s investigation found links to individuals associated with the Social Liberal Party and some of the employees of the offices of Anderson Moraes, Alana Passos, Eduardo Bolsonaro, Flavio Bolsonaro, and Jair Bolsonaro.

Facebook also removed 72 Facebook accounts, 35 Pages, and 13 Instagram accounts that originated in Ukraine. This network focused on domestic audiences. They created fictitious personas, posted in Groups and Pages, commented on their own content, and evaded enforcement. Some of this network’s accounts had been removed for hate speech and impersonation. This network was active during the 2019 presidential and parliamentary elections in Ukraine.

Facebook’s investigation found links to Postmen DA, an advertising agency in Ukraine.


Facebook Won’t Share Ad Revenue with Australian News Organizations



Facebook has rejected a proposal by the Australian Competition & Consumer Commission (ACCC) to share advertising revenue with Australian news organizations, The Guardian reported. Facebook says the impact on its business would “not be significant” if it stopped carrying news altogether.

The ACCC is an independent Commonwealth statutory authority whose role is to enforce the Competition and Consumer Act 2010 and a range of additional legislation, promoting competition, fair trading and regulating national infrastructure for the benefit of all Australians.

Based on this description, it seems to me that the ACCC is well within its rights to push Facebook to pay for using news content by sharing advertisement revenue with Australian news organizations.

Facebook, being what it is, has rejected the proposal. In its submission to the ACCC, Facebook said there was a “healthy rivalry” between itself and news organizations. It also said that it supported the idea of a code of conduct between digital platforms and news publishers, but that Facebook and Google were being “singled out” unfairly.

Personally, I find it hard to believe that Facebook (and Google) are being “singled out” unfairly by the ACCC’s proposal that the companies start sharing ad revenue with the news websites they each glean content (and revenue) from. This is nothing more than an excuse from Facebook, which is clearly trying to avoid the very code of conduct it says it supports.

In addition, The Guardian reported that Facebook said: “If there were no news content available on Facebook in Australia, we are confident the impact on Facebook’s community metrics and revenues in Australia would not be significant.”

To me, that sounds like Facebook is threatening to cut off Australian users from all news content. It also sounds sketchy. If the revenue Facebook gets from serving ads attached to news content to Australian users is really so small, then a company as big as Facebook could easily afford to pay news organizations for that content.


Biden Campaign Asks Facebook to Strengthen Rules on Misinformation



The Joe Biden campaign posted an open letter to Facebook calling on Facebook to strengthen its rules on misinformation. As you may have guessed, Facebook has absolutely no interest in doing anything that would stem the tide of misinformation on its platform.

Joe Biden is the presumptive presidential nominee of the Democratic Party. The open letter was addressed to Mark Zuckerberg. Part of the letter says:

…We call for Facebook to proactively stem the tide of false information by no longer amplifying untrustworthy content and promptly fact-checking election-related material that goes viral.

We call for Facebook to stop allowing politicians to hide behind paid misinformation in the hope that the truth will catch up only after Election Day. There should be a two-week pre-election period during which all political advertisements must be fact-checked before they are permitted to run on Facebook.

And we call for clear rules – applied to everyone, including Donald Trump – that prohibit threatening behavior and lies about how to participate in the election…

Facebook responded with one paragraph that said:

“We live in a democracy, where the elected officials decide the rules around campaigns. Two weeks ago the President of the United States issued an executive order directing Federal agencies to prevent social media sites from engaging in activities like fact-checking political statements. This week, the Democratic candidate for President started a petition calling on us to do the exact opposite. Just as they have done with broadcast networks – where the US government prohibits rejecting politicians’ campaign ads – the people’s elected representatives should set the rules, and we will follow them. There is an election coming in November and we will protect political speech, even when we strongly disagree with it.”

The executive order Facebook referred to came after Twitter added a fact-check to two of President Trump’s tweets. Since then, Twitter marked one of Trump’s tweets as breaking Twitter’s rules about glorifying violence.

In short, we have two rich white men, both in their 70s and each the presumptive presidential nominee of his party, arguing with social media giants.


Facebook’s Acquisition of GIPHY Questioned by Several Authorities



The UK’s Competition and Markets Authority (CMA) is investigating Facebook’s acquisition of GIPHY. The CMA is considering whether the transaction has resulted in the creation of a relevant merger situation under the merger provisions of the Enterprise Act 2002 and, if so, whether that merger situation may be expected to result in a substantial lessening of competition within any market or markets in the United Kingdom for goods or services.

As you may recall, GIPHY was acquired by Facebook in May of 2020. At the time, Facebook stated that 50% of GIPHY’s traffic already came from Facebook’s family of apps, with half of that amount coming from Instagram alone.

Reuters reported that Facebook is now pausing the integration of GIPHY.

Reuters also reported that the CMA served Facebook with an initial enforcement order earlier this week and began the first stage of an investigation on Friday. That part of the investigation invites comments on the merger of Facebook and GIPHY from any interested party.

Politico reported that the U.S. Department of Justice and the U.S. Federal Trade Commission are also seeking to review the Facebook/GIPHY merger. It appears each agency wants to determine for itself whether the merger should be allowed to go forward. The two agencies have not yet resolved a dispute over which of them should take on the task, according to Politico.

Earlier this month, the Australian Competition & Consumer Commission (ACCC) began an investigation of Facebook’s acquisition of GIPHY.

The ACCC is considering whether the acquisition:

  • provides Facebook with data that will strengthen its market power in any markets
  • provides Facebook with data about its social media and online private messaging rivals, which may lead to a substantial lessening of competition
  • could lead to the foreclosure of Facebook’s social media and online private messaging rivals

Mark Zuckerberg Defended Leaving Up Trump’s Posts



Facebook and Twitter are very different social media platforms. Recently, the differences have become vividly clear, as we see how each platform chooses how they will respond to controversial content posted by President Trump.

The Verge obtained a recording of an extended conference with employees in which Mark Zuckerberg addressed accusations that Facebook allowed election misinformation and veiled promotions of violence from President Trump. According to The Verge, Mark Zuckerberg stood by what he described as a “pretty thorough” evaluation of Trump’s posts. Zuckerberg reportedly said that the choice to avoid labeling them or removing them was difficult but correct.

As you may have heard, Twitter added a fact-check to two of President Trump’s tweets about mail-in ballots. Twitter also flagged another tweet made by President Trump because it violated Twitter’s rules about glorifying violence. It should be noted that all three of those tweets are still on Twitter. Those who want to read them can simply click a link to view them.

There has been some pushback. President Trump issued an executive order that some see as intended to curtail free speech on Twitter’s platform. Personally, I think that President Trump should have read Twitter’s policies about what is, and is not, allowed on their platform. If he had done that, and acted accordingly, there would be no need for that executive order.

The Guardian reported that Facebook staff held a virtual walkout to show their disagreement with Mark Zuckerberg’s decision regarding posts by President Trump. Some took to Twitter to express their displeasure. Facebook software engineer Timothy J. Aveni resigned in response to Mark Zuckerberg’s decision to leave up Trump’s post that called for violence. The Hill reported that Owen Anderson, another Facebook employee, announced his departure from the company on Twitter.

Overall, I think that people who are fans of President Trump are going to take his side of the situation no matter what. Those who dislike Facebook and/or Mark Zuckerberg’s decision-making process might choose to leave that platform. Those angry with Twitter may quit that platform. None of this is going to lead to healthier versions of either Facebook or Twitter, and I miss the days before politicians invaded social media.