Category Archives: Facebook

Facebook Executives Dismissed Efforts to Make the Site Less Divisive



The Wall Street Journal has a very detailed article that examines why Facebook is such a toxic place to visit. Several efforts were made internally to make Facebook less divisive, but executives shut down or weakened those efforts.

Facebook had kicked off an internal effort to understand how its platform shaped user behavior and how the company might address potential harms. Chief Executive Mark Zuckerberg had in public and private expressed concern about “sensationalism and polarization.”

But in the end, Facebook’s interest was fleeting. Mr. Zuckerberg and other senior executives largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to Facebook products.

Years ago, I quit using Facebook because it was making me irritated and unhappy. My relatives, most of whom do not share my political views, suddenly all seemed to feel the need to post angry and hurtful political content. It was right around the 2016 U.S. presidential election between Hillary Clinton (Democrat) and Donald Trump (Republican). Based on the information in The Wall Street Journal article, it appears that Facebook treated divisiveness as a “feature”, not a “bug”.

A 2018 presentation pointed out that Facebook’s algorithms “exploit the human brain’s attraction to divisiveness”. The presenter warned that if left unchecked, Facebook would feed users “more and more divisive content in an effort to gain user attention & time on the platform.” It appears that observation was ignored.

A Common Ground team proposed that conversations in groups that had been derailed by hot-button issues could be salvaged if a moderator moved the argument to a temporary subgroup. Another option was to tweak recommendation algorithms to suggest a wider range of Facebook groups than people would ordinarily encounter. Neither idea was implemented.

An idea called “Sparing Sharing” would have reduced the spread of content disproportionately favored by “hyperactive users”. Its effects would have fallen most heavily on content favored by users on the far right and far left, and it would have given middle-of-the-road users more influence. It was dismissed after executives insisted the change would also harm Girl Scouts who were trying to sell cookies.
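
To make the “Sparing Sharing” concept concrete, here is a minimal sketch of how down-weighting hyperactive sharers could work in principle. It is purely illustrative and assumes a hypothetical activity-based weighting function; it is not Facebook’s actual ranking code, and every name and threshold below is invented.

```python
# Purely illustrative sketch of a "Sparing Sharing"-style adjustment:
# shares from hyperactive accounts contribute less to a post's score,
# so middle-of-the-road users carry relatively more influence.
# This is NOT Facebook's algorithm; names and thresholds are invented.

from dataclasses import dataclass


@dataclass
class Share:
    user_id: str
    shares_last_week: int  # how active the sharing user has been recently


def share_weight(shares_last_week: int, typical_weekly_shares: int = 10) -> float:
    """Full weight for typically active users, decaying weight for hyperactive ones."""
    if shares_last_week <= typical_weekly_shares:
        return 1.0
    return typical_weekly_shares / shares_last_week


def post_share_score(shares: list[Share]) -> float:
    """Activity-weighted share count instead of a raw share count."""
    return sum(share_weight(s.shares_last_week) for s in shares)


# 100 shares from one hyperactive account now count for far less than
# 100 shares from 100 ordinary accounts.
hyperactive = [Share("power_user", shares_last_week=500) for _ in range(100)]
ordinary = [Share(f"user_{i}", shares_last_week=5) for i in range(100)]
print(post_share_score(hyperactive))  # 2.0
print(post_share_score(ordinary))     # 100.0
```

The only design point illustrated is that a post’s share-based score grows with the number of distinct, typically active people sharing it rather than with raw share volume, which is the gist of what the proposal reportedly aimed for.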


Facebook Introduced Messenger Rooms



Facebook is rolling out Messenger Rooms. They are also expanding WhatsApp group calls and adding new live video features for Facebook, Instagram, and Portal. Perhaps Facebook is trying to compete with Zoom, which has grown to 300 million daily meeting participants.

Messenger Rooms make it easy to spend quality time with friends, loved ones, and people who share your interests. Create a room right from Messenger or Facebook, and invite anyone to join your video call, even if they don’t have a Facebook account. Rooms will soon hold up to 50 people with no time limit.

Facebook states that when you create a room, you choose who can see and join it. You can remove people from the call and lock a room if you don’t want anyone else to enter. Messenger Rooms is rolling out to some countries this week and will expand to the rest in coming weeks.

The Verge points out that room calls are not end-to-end encrypted, but Facebook says it does not view or listen to calls.

Facebook is expanding WhatsApp group calls to enable group voice and video calls with up to eight people. Facebook reminds users that, as before, WhatsApp calls are secured with end-to-end encryption so no one else can view or listen to your private conversation, not even WhatsApp. It seems to me that the smarter, safer thing to do would be to use WhatsApp instead of Messenger Rooms.

On Instagram, you can now watch and comment on live videos from your desktop. And you can save your videos to IGTV, which Facebook has been pushing. Doing so will allow your video to stick around longer than the 24-hour time limit in Stories.

Soon, users will be able to go live on Portal to Facebook Pages and Groups. The Facebook Live Portal app already lets users broadcast to their own profile. Later this month, Facebook will make it possible for Portal users to share their broadcasts with their communities.

I’m not comfortable with Facebook, especially considering all the data it grabs from users. People who do not have a Facebook account and who join a Messenger Room may be unaware that Facebook may start collecting data on them. A few years ago, it was reported that human contractors were reviewing voice recordings captured through Portal, which should make people very hesitant to use it.


Australian Code of Conduct to Make Social Media Companies Pay for News



The Australian federal government has asked the Australian Competition and Consumer Commission (ACCC) to create a mandatory code of conduct that would require companies like Google and Facebook to pay media companies for news. This comes after the ACCC advised that reaching a voluntary agreement with the big social media companies to pay for content would be “unlikely”.

The mandatory code will cover issues including the sharing of data, ranking of news content online and the sharing of revenue generated from the news. It will be enforced through penalties and sanctions and will include a binding dispute resolution process.

A draft of the mandatory code of conduct will be released in July of 2020. Negotiations over a voluntary code of conduct were expected to run until November, but the mandatory code is now being created sooner, in part because COVID-19 has “exacerbated financial woes within the media sector.”

The Guardian reported that the mandatory code of conduct would force Facebook and Google to pay news media for their content, advise news media in advance of algorithm changes that would affect content rankings, favor original source and new content in search page results, and share data with media companies. At least some of this was also in the voluntary version of the code of conduct.

I find this fascinating because, if the Australian mandatory code of conduct is put in place, it could set a precedent for other countries to make one of their own. Obviously, Google and Facebook will fight against this, as they are quite used to receiving plenty of content for free while paying the content creators little to nothing.

It has become common for people to seek news online rather than through a newspaper subscription. It seems only fair that news organizations should be financially compensated for their content that big social media companies financially benefit from.


Facebook will Remove Harmful Misinformation About COVID-19



Facebook announced that they are limiting misinformation about COVID-19. Facebook has directed over 2 billion people to resources from the WHO and other health authorities through their COVID-19 Information Center and pop-ups on Facebook and Instagram, with over 350 million people clicking through to learn more.

The goal appears to be to prevent the spread of misinformation about COVID-19. To me, it makes sense to do this because some of the misinformation being spread around about this virus is dangerous. Facebook says that they have removed thousands of pieces of misinformation that could cause harm. Two examples of that misinformation are: drinking bleach cures the virus (it doesn’t), and physical distancing is ineffective at preventing the disease from spreading (in reality, physical distancing works very well).

We’re going to start showing messages in News Feed to people who have liked, reacted, or commented on harmful misinformation about COVID-19 that we have since removed. These messages will connect people to COVID-19 myths debunked by the WHO including ones we’ve removed from our platform for leading to imminent physical harm. We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook. People will start seeing these messages in the coming weeks.
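
As a rough sketch of the follow-up workflow Facebook describes above, the hypothetical snippet below gathers the users who liked, reacted to, or commented on posts that were later removed and queues a single debunking notice per user. The data shapes, helper names, and notification mechanism are invented for illustration; Facebook has not published how its own system works, and the WHO link is simply a stand-in for its public myth-busters page.

```python
# Hypothetical sketch of the flow described above: collect users who
# interacted with since-removed COVID-19 misinformation and queue one
# message per user pointing to the WHO's myth-busting resources.
# Data shapes and helpers are invented for illustration only.

from typing import Callable, Iterable

# Stand-in for wherever the debunking message should point.
WHO_MYTH_BUSTERS_URL = "https://www.who.int/emergencies/diseases/novel-coronavirus-2019/advice-for-public/myth-busters"


def users_who_interacted(removed_post: dict) -> set[str]:
    """Users who liked, reacted to, or commented on the removed post."""
    return set(
        removed_post.get("likes", [])
        + removed_post.get("reactions", [])
        + removed_post.get("comments", [])
    )


def queue_debunk_messages(
    removed_posts: Iterable[dict],
    notify: Callable[[str, str], None],
) -> int:
    """Queue at most one debunking message per affected user; return the count."""
    notified: set[str] = set()
    for post in removed_posts:
        for user_id in users_who_interacted(post):
            if user_id not in notified:
                notify(user_id, WHO_MYTH_BUSTERS_URL)
                notified.add(user_id)
    return len(notified)


# Example usage with a print-based stand-in for the real notification system.
removed = [{"likes": ["alice"], "comments": ["bob"], "reactions": ["alice"]}]
print(queue_debunk_messages(removed, lambda user, url: print(user, "->", url)))  # 2
```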

To help people find the facts about COVID-19, Facebook has enlisted the help of over 60 fact-checking organizations that review and rate content in more than 50 languages around the world. Facebook added eight new fact-checking partners since the beginning of March, including MyGoPen in Taiwan, the AFP and dpa in the Netherlands, Reuters in the UK, and others.

The Guardian clarifies that Facebook’s new policy applies only to misinformation that Facebook considers likely to contribute to “imminent physical harm”, such as claims about “cures” or statements that physical distancing is not effective. However, Facebook is not taking down other misinformation about COVID-19, such as conspiracy theories about the virus’s origins.

In my opinion, people who go online and attempt to convince frightened people to drink bleach should be held accountable for their actions. Removing harmful misinformation like that is a good start, but Facebook should also take away the accounts of the people who are spreading harmful misinformation during a pandemic.


Facebook to Award $100M in Cash Grants to Small Businesses



Facebook’s Chief Operating Officer Sheryl Sandberg posted on Facebook about the company’s plans to help small businesses that have been affected by COVID-19. In the post, she stated that the economic disruption poses a severe risk to small businesses. Facebook will help by awarding $100 million in cash grants and ad credits to small businesses.

…We’ve listened to small businesses to understand how we can best help them. We’ve heard loud and clear that financial support could enable them to keep the lights on and pay people who can’t come to work. That’s why today I’m announcing that Facebook is investing $100 million to help 30,000 small businesses in over 30 countries where our employees live and work…

Reuters reported that companies will be able to use the cash to pay rent, cover operational costs, or run advertising on Facebook. Reuters also pointed out that Facebook will soon disclose more details about how businesses will be able to apply for assistance.

The Facebook for Business website now has a Boost With Facebook page. Small business owners who want to obtain a grant from Facebook can sign up for updates through that website. Facebook will begin taking applications in the coming weeks.

I think this is an excellent idea. Small businesses that receive a grant can use the money to keep their business running during the COVID-19 pandemic. The best part is that these are grants, not loans: they do not accrue interest, and the small businesses that are eligible for this program will not have to pay the money back to Facebook later on.


Facebook and Twitter Removed Accounts with Ties to Russia’s IRA



Facebook and Twitter have both stated that they have removed accounts that were operating out of Ghana and Nigeria and had ties to Russia’s Internet Research Agency (IRA). This comes after CNN’s investigation uncovered activity that “had striking similarities to the Russian troll campaign of 2016, which created hundreds of accounts designed to pass as American”.

According to CNN, Facebook and Twitter had already been looking into some of the troll accounts when CNN notified the two companies of their investigation. Facebook announced:

Today, we removed 49 Facebook accounts, 69 Pages and 85 Instagram accounts for engaging in foreign interference – which is coordinated inauthentic behavior on behalf of a foreign actor – on Facebook, Instagram, and other internet platforms. This network was in the early stages of building an audience and was operated by local nationals – some wittingly and some unwittingly – in Ghana and Nigeria on behalf of individuals in Russia. It targeted primarily the United States.

Facebook stated that they detected this network as a result of their internal investigation into suspected coordinated inauthentic behavior ahead of US elections. They note that their assessment benefited from their subsequent collaboration with a team of journalists at CNN. Facebook said it shared information with their industry peers, policy makers, and law enforcement.

Twitter, as you might expect, posted a thread of tweets about the situation. The thread started with: “Our top priority is keeping people safe. In collaboration with law enforcement, industry peers, journalists, and expert researchers, we recently suspended a small network of accounts largely Tweeting in English and that presented themselves as based in the United States.”

The next tweet in the thread said: “These 71 removed accounts, operating out of Ghana and Nigeria and which we can reliably associate with Russia, attempted to sow discord by engaging in conversations about social issues, like race and civil rights.”

It would be smart to keep this situation in mind as you use Facebook or Twitter. There is no logical reason to assume that every account you see is authentic. If you read or watch something on social media that causes you to feel angry or outraged, please wait a few minutes before sharing it. The account it came from just might be a troll – hoping to affect your emotional state so much that you share the content as quickly as possible. Don’t help the trolls!


Facebook Will Allow Bloomberg’s Political Memes



Yesterday, The New York Times reported that Mike Bloomberg is working with Meme 2020 to have the company make memes that support Bloomberg’s presidential campaign. Today, The Verge reported that Facebook will allow these sponsored memes from political campaigns, so long as the posts are clearly disclosed as paid partnerships through Facebook’s branded content tools. Because they are branded content rather than ads, the memes will not be placed in Facebook’s political Ad Library.

“Branded content is different from advertising, but in either case we believe it’s important people know when they’re seeing paid content on our platforms,” a Facebook spokesperson told The Verge. “We’re allowing US-based political candidates to work with creators to run this content, provided the political candidates are authorized and the creators disclose any paid partnerships through our branded content tools.”

Personally, I wouldn’t have guessed that out of all the people who are running for president it would be Mike Bloomberg who decided to pay influencers to make memes about him. I don’t think there are too many 77-year-olds who understand what memes are, how fast they spread, or what they mean. I’d like to hear the story of how Bloomberg came to the conclusion that memes were exactly what his campaign needed.

But that’s not the weirdest thing about this situation. According to The Verge, the Meme 2020 project is part of Jerry Media, the promoter behind the infamous Fyre Festival. Meme 2020 is led by Mick Purzycki, the executive director of Jerry Media. What could possibly go wrong?