ABC News reported Facebook will walk back its block on Australian users sharing news on its site after the government agreed to make amendments to the proposed media bargaining laws that would force major tech giants to pay news outlets for their content.
This decision is a result of negotiations between Australian Treasurer Josh Frydenberg and Facebook CEO Mark Zuckerberg. ABC News quoted Frydenberg as saying, “Mark Zuckerberg said to me today [restoring pages] will occur in coming days.”
Facebook updated its Facebook Journalism Project post (which was originally about the company’s decision to restrict the availability of Australian news on Facebook) with this:
“After further discussions with the Australian government, we have come to an agreement that will allow us to support the publishers we choose to, including small and local publishers. We’re restoring news on Facebook in Australia in the coming days. Going forward, the government has clarified we will retain the ability to decide if news appears on Facebook so that we won’t automatically be subject to a forced negotiation. It’s always been our intention to support journalism in Australia and around the world, and we’ll continue to invest in news globally and resist efforts by media conglomerates to advance regulatory frameworks that do not take account of the true value exchange between publishers and platforms like Facebook.”
Personally, I am skeptical of Facebook’s claim that it has always been its intention to support journalism in Australia. If it cared about supporting news publishers, it would not have banned Australian news. That decision caused collateral damage: it also blocked Australian and local news in Fiji, Nauru, Papua New Guinea, Samoa, Tonga and Vanuatu.
Facebook’s decision also enabled anti-vaccine misinformation to spread widely since real Australian news organizations were unable to respond to and correct the misinformation in those posts. This happened at the very beginning of Australia’s vaccine rollout. In short, Facebook’s attempt to avoid paying for news may have resulted in vaccine hesitancy among some Australians.
Australia’s ABC News App has become extremely popular with Australians, who can no longer access local and country-wide news on Facebook. The app is a resource that connects users with news content created by the Australian Broadcasting Corporation. It is available on the App Store and Google Play.
As you may have heard, Facebook blocked all Australian news content from Australian news publishers. Facebook chose this drastic measure in an effort to avoid complying with Australia’s proposed News Media Bargaining Code, legislation developed by the Australian Competition and Consumer Commission (ACCC) that would require platforms like Facebook to pay news organizations for their content.
Facebook got overzealous in deciding what to remove, and blocked Australian government accounts, state health departments, weather information, and even Facebook’s own Facebook page. Other collateral damage included blocking emergency services, public officials, food banks and charities. It is my understanding that some of that has been restored.
Financial Times’ Uma Patel tweeted: “ABC has used facebook’s ban to prompt visitors to download its app… it became the most downloaded app in Australia… although the next four are all owned by Facebook and the sixth is a company fb tried to buy.”
I think this is a good sign! Australians who have grown accustomed to scrolling through news on Facebook on their phone can replace that with ABC’s News App. I’m hoping that this inspires more news organizations to create their own news apps. The result could influence people to spend less time on Facebook.
There is another good reason for news organizations to make their own apps (or to advertise their existing ones). Facebook is likely to engage in the same shenanigans it imposed on its Australian users when other countries create legislation similar to Australia’s news media bargaining code. When that happens, people will immediately be able to use their favorite news app to get their news.
Facebook has retaliated against the people of Australia by removing all content from Australian news publishers from its platform. This is Facebook’s latest temper tantrum about an Australian law that would require Facebook (and Google) to pay news organizations for their content.
The law Facebook is angry about is the News Media Bargaining Code, a mandatory code developed by the Australian Competition & Consumer Commission (ACCC). It covers issues like the sharing of data, the ranking of news content online, and the sharing of revenue generated from news. The law will be enforced through penalties and sanctions and will include a binding dispute resolution process.
Facebook posted the following on their Newsroom blog:
…Unfortunately, this means people and news organizations in Australia are now restricted from posting news links and sharing or viewing Australian and international news content on Facebook. Globally, posting and sharing news links from Australian publishers is also restricted. To do this, we are using a combination of technologies to restrict news content and we will have processes to review any content that was inadvertently removed…
In June of 2020, Facebook whined that it and Google were being “singled out” unfairly by this law. Facebook stated: “If there were no news content available on Facebook in Australia, we are confident that the impact on Facebook’s community metrics and revenues in Australia would not be significant.”
It seems to me that if Facebook believes that Facebook’s revenue would not significantly change by removing Australian news – it means Facebook can easily afford to pay for it.
NBC News reported that as of today, Australian users and publishers would not be able to post news content to Facebook’s social network after the Australian government threatened to force the company to pay news publishers. According to NBC News, Australian publishers will be restricted from sharing or posting content to their company pages. News publishers outside of Australia can still post articles, but Australians will not be able to view them.
According to NBC News, Google has decided that it will pay news publishers for their content. Google will remunerate French newspapers based on contributions to political and general information, daily volume of publications and monthly internet audience.
Germany’s Bundeskartellamt (which TechCrunch translates as Germany’s Federal Cartel Office) has initiated abuse proceedings against Facebook to examine the tying of Oculus virtual reality products to Facebook’s social network and platform.
Andreas Mundt, President of the Bundeskartellamt, wrote:
“In the future, the use of the new Oculus glasses requires the user to also have a Facebook account. Linking virtual reality products and the group’s social network in this way could constitute a prohibited abuse of dominance by Facebook. With its social network Facebook holds a dominant position in Germany and is also already an important player in the emerging but growing VR (virtual reality) market. We intend to examine whether and to what extent this tying arrangement will affect competition in both areas of activity.”
In August, Facebook announced that it was changing the name of the VR business it acquired back in 2014 for around $2 billion – and had allowed to operate separately – to “Facebook Reality Labs,” signaling the assimilation of Oculus into its wider social empire, TechCrunch reported.
Also in August, Oculus announced that users would be required to log into Oculus with their Facebook accounts – beginning in October of 2020. Oculus users who did not have a Facebook account, and who did not want to make one, would eventually be unable to use Oculus.
TechCrunch reported that a Facebook spokesperson sent a statement. “While Oculus devices are not currently available for sale in Germany, we will cooperate fully with the Bundeskartellamt and are confident we can demonstrate that there is no basis to the investigation.”
We will have to wait and see what happens with Germany’s investigation into Facebook requiring Oculus users to have a Facebook account. Meanwhile, Oculus users in the United States, who want to continue using Oculus, are required to have a Facebook account. To me, it seems like if you want to use Oculus, you have to be tied to Facebook forever – or lose access.
The Federal Trade Commission (FTC) announced that it has sued Facebook. The FTC alleges that Facebook is illegally maintaining its personal social networking monopoly through a years-long course of anticompetitive conduct. The lawsuit comes after a lengthy investigation in cooperation with a coalition of attorneys general of 46 states, the District of Columbia, and Guam.
The FTC is seeking a permanent injunction in federal court that could, among other things: require divestitures of assets, including Instagram and WhatsApp; prohibit Facebook from imposing anticompetitive conditions on software developers; and require Facebook to seek prior notice and approval for future mergers and acquisitions.
A separate lawsuit is led by New York Attorney General Letitia James, who stated that: “The lawsuit alleges that, over the last decade, the social networking giant illegally acquired competitors in a predatory manner and cut services to smaller threats – depriving users of the benefits of competition and reducing privacy protections and services along the way – all in an effort to boost its bottom line through increased advertising revenue.”
The Verge reported that this lawsuit centers on Facebook’s acquisitions, particularly its $1 billion purchase of Instagram in 2012. In addition to its acquisition strategy, the attorneys general allege that Facebook used the power and reach of its platform to stifle user growth for competing services. The Verge also reported that the FTC case cites Facebook’s decision to block Vine’s friend-finding feature after Twitter acquired the app as a particularly flagrant instance of this behavior.
To me, it seems like Facebook could potentially face some legal consequences as a result of one – or both – of these lawsuits. It will be interesting to see what would happen if Facebook is required to separate itself from Instagram and WhatsApp. If Facebook is required to improve user privacy, I think many people would want to know the specific details about how it will do that.
Facebook has placed labels on content that includes misinformation about elections. The labels have been added to some of President Trump’s posts in which he made claims about the election that Facebook deemed to be false information. Unfortunately for Facebook (and its users), the labels did almost nothing to stop the spread of false information posted by President Trump.
BuzzFeed News reported that a Facebook employee asked last week whether Facebook had any data about the effectiveness of the labels. A data scientist revealed that the labels do very little to reduce the spread of false content.
The data scientist noted that the labels were not expected to reduce the spread of false content; instead, they are used “to provide factual information in context to the post.” BuzzFeed News reported that the labels decreased reshares of President Trump’s posts containing false information by only about 8%, and that those posts were among the ones that got the most engagement on the platform.
Why did that happen? The answer seems obvious, based on what BuzzFeed News reported. Facebook applied labels to some of President Trump’s posts that contained misinformation about the election, but it didn’t actually do anything to prevent users from liking or sharing those posts.
Twitter also applied labels to some of President Trump’s tweets that contained misinformation about elections. The addition of a label disables users’ ability to Retweet or Like those tweets. Users can still Quote-Tweet them if they want to add their own commentary on a specific labeled tweet.
On November 12, 2020, Twitter posted an update about their work regarding the 2020 U.S. Elections. In it, Twitter stated that they saw an estimated 29% decrease in Quote Tweets of the labeled tweets due in part to a prompt that warned people prior to sharing. In the same post, Twitter stated that they don’t believe that the Like button provides sufficient, thoughtful consideration prior to amplifying tweets.
I find it interesting that Twitter and Facebook appear to have entirely different ideas about what to do about election-related content that is misinformation. Both applied labels, but Twitter took things a step further and disabled users’ ability to Like or Retweet those kinds of posts. Neither platform was 100% successful at stopping the spread of misinformation – but Twitter did a better job of it than Facebook.
In September, Facebook announced that it won’t accept political ads in the week before the US election. The ban applies only to political ads submitted after October 27, 2020.
Recently, Nick Clegg, Facebook’s vice president of global affairs and communications, told the French weekly Journal du Dimanche that a total of 2.2 million ads on Facebook and Instagram have been rejected, and 120,000 posts withdrawn, for attempting to “obstruct voting” in the upcoming US election. In addition, Facebook has been posting warnings on 150 million pieces of false information that were on Facebook and Instagram.
Facebook has been increasing its efforts to avoid a repeat of events leading up to the 2016 US presidential election, won by Donald Trump, when its network was used for attempts at voter manipulation carried out from Russia.
There were similar problems ahead of Britain’s 2016 referendum on leaving the European Union.
According to Nick Clegg, Facebook has thirty-five thousand employees working on the security of Facebook’s platforms, including election integrity. The company also has partnerships with 70 specialized media outlets, including five in France, to verify information. Facebook also uses artificial intelligence that Nick Clegg says has “made it possible to delete billions of posts and fake accounts, even before they are reported by users.”
It appears that Facebook is putting in some effort to remove political misinformation, and also to reject unacceptable political ads. To me, this is a starting point that should have begun before the US primary elections and caucuses. Waiting until right before Election Day to clean up its platforms is too late.