Category Archives: Facebook

Facebook Removed Coordinated Inauthentic Behavior from China



Facebook has removed seven Pages, three Groups, and five Facebook accounts involved in coordinated inauthentic behavior as part of a small network that originated in China and focused on Hong Kong.

Facebook took these actions based on a tip shared by Twitter about activity Twitter found on its own platform. The tip led Facebook to conduct its own investigation into suspected coordinated inauthentic behavior in the region and to identify this network.

According to Facebook, the individuals behind this campaign engaged in a number of deceptive tactics, including the use of fake accounts – some of which had already been disabled by Facebook’s automated systems – to manage Pages posing as news organizations, post in Groups, disseminate their content, and drive people to off-platform news sites. They frequently posted about local political news and issues, including topics like the ongoing protests in Hong Kong. Although the people behind this activity attempted to conceal their identities, Facebook’s investigation found links to individuals associated with the Chinese government.

Facebook released the following details:

  • Presence on Facebook: 5 Facebook accounts, 7 Pages and 3 Groups
  • Followers: About 15,500 accounts followed one or more of these Pages and about 2,220 accounts joined at least one of these Groups.

Facebook points out that they work to detect and stop this type of activity because they don’t want their services to be used to manipulate people. Facebook also makes it clear they took down these Pages, Groups, and accounts “based on their behavior, not the content they posted”. Personally, I’m not sure how one would separate the content from the behavior. Perhaps this is a disclaimer, of sorts, from Facebook.

Facebook also said the people behind this activity “coordinated with one another and used fake accounts to misrepresent themselves, and that was the basis of our action.”

I find it interesting that a tip from Twitter is what prompted Facebook to do its own investigation. I don’t think I’ve seen the big social media companies work together in this way before. It seems to me that the results were effective.


Facebook Wants to Keep Private Groups Safe



Facebook announced some changes that are designed to keep people safe within Facebook Groups. In part, Facebook will hold Group admins accountable for what is posted in their group.

Facebook says one way they keep people safe is by proactively identifying and removing posts and groups that break their rules. Facebook has been using AI and machine learning “to proactively detect bad content before anyone reports it, and sometimes before people even see it.” It also uses human moderators.

Facebook has created new Group Privacy settings:

By default, a group that was formerly “secret” will now be “private” and “hidden”. A group that was formerly “closed” will now be “private” and “visible”. Groups that are “public” will remain “public” and “visible”.
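The renaming described above can be summed up as a simple mapping from the old single setting to a new (privacy, visibility) pair. Here is a minimal sketch of that mapping; the names and function are illustrative, not part of any actual Facebook API:

```python
# Hypothetical mapping of Facebook's old Group privacy settings to the
# new (privacy, visibility) pairs described in the announcement.
OLD_TO_NEW = {
    "secret": ("private", "hidden"),
    "closed": ("private", "visible"),
    "public": ("public", "visible"),
}

def migrate(old_setting: str) -> tuple[str, str]:
    """Return the new (privacy, visibility) pair for a pre-change group setting."""
    return OLD_TO_NEW[old_setting]
```

For example, `migrate("secret")` yields `("private", "hidden")` – the group stays invisible to non-members, it is just described with the new vocabulary.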

Here are some factors Facebook considers when deciding if a Group should come down:

  • Does the name or description of the group include hate speech or other content Facebook doesn’t allow?
  • If group leaders often break Facebook’s rules, or if they commonly approve posts from other members who break Facebook’s rules, those are clear strikes against the overall group.
  • If a group member repeatedly violates Facebook’s standards, Facebook will start requiring admins to review their posts before anyone else can see them. Then, if an admin approves a post that breaks Facebook’s rules, it will count against the whole group.

It sounds like people who participate in Groups on Facebook really need to choose wisely when selecting admins. Facebook’s emphasis that their rules apply within Groups is likely going to deter those who have been de-platformed from other online spaces. I guess that’s one way to help keep people safe in Private Groups.


FTC Fines Facebook $5 Billion and Imposes New Privacy Policy



The Federal Trade Commission (FTC) announced that it has imposed a historic $5 billion penalty and significant requirements on Facebook to boost accountability and transparency.

Facebook Inc. will pay a record-breaking $5 billion penalty, and submit to new restrictions and a modified corporate structure that will hold the company accountable for the decisions it makes about its users’ privacy, to settle Federal Trade Commission charges that the company violated a 2012 FTC order by deceiving users about their ability to control the privacy of their personal information.

The FTC states that the $5 billion penalty against Facebook is the largest ever imposed on any company for violating consumers’ privacy and almost 20 times greater than the largest privacy or data security penalty ever imposed worldwide. It is one of the largest penalties ever assessed by the U.S. government for any violation.

The Department of Justice will file a complaint on behalf of the FTC alleging that Facebook repeatedly used deceptive disclosures and settings to undermine users’ privacy preferences in violation of its 2012 FTC order. These tactics allowed Facebook to share users’ personal information with third-party apps that were downloaded by Facebook “friends”. The FTC alleges that many users were unaware that Facebook was sharing such information, and therefore did not take the steps needed to opt out of sharing.

The FTC has also sued Cambridge Analytica, its former Chief Executive Officer Alexander Nix, and Aleksandr Kogan, an app developer who worked with the company, alleging they used false and deceptive tactics to harvest personal information from millions of Facebook users. Kogan and Nix have agreed to a settlement with the FTC that will restrict how they conduct any business in the future.

The FTC’s new 20-year settlement order with Facebook establishes an independent privacy committee of Facebook’s board of directors, “removing unfettered control by Facebook’s CEO Mark Zuckerberg over decisions affecting user privacy”. Members of the privacy committee will be independent and appointed by an independent nomination committee. Members can only be fired by a supermajority of the Facebook board of directors.

Facebook must designate compliance officers who will be responsible for Facebook’s privacy program. These officers are subject to the approval of the new board privacy committee and can only be removed by that committee. An independent third-party assessor will evaluate the effectiveness of Facebook’s privacy program and identify any gaps.

Facebook’s order-mandated privacy program also covers WhatsApp and Instagram. Facebook must conduct a privacy review of every new or modified product, service, or practice before it is implemented and document its decisions about user privacy. Facebook must share that documentation with its CEO, the independent assessor, and the FTC.

Other requirements include:

  • Facebook must exercise greater oversight over third-party apps, including by terminating app developers that fail to certify that they are in compliance with Facebook’s platform policies or fail to justify their need for specific user data
  • Facebook is prohibited from using telephone numbers obtained to enable a security feature (e.g. two-factor authentication) for advertising
  • Facebook must provide clear and conspicuous notice of its use of facial recognition technology, and obtain affirmative express user consent prior to any use that materially exceeds its prior disclosures to users
  • Facebook must establish, implement, and maintain a comprehensive data security program
  • Facebook must encrypt user passwords and regularly scan to detect whether any passwords are stored in plaintext
  • Facebook is prohibited from asking for email passwords to other services when consumers sign up for its services
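The order doesn’t say how Facebook must “regularly scan to detect whether any passwords are stored in plaintext,” but one common approach is to sweep logs and data stores for unredacted password fields. A minimal sketch, with an assumed log format and a pattern that is purely illustrative:

```python
import re

# Hypothetical scanner: flag log lines that appear to contain an
# unredacted password value (e.g. "password=hunter2"). The field names
# and redaction markers are assumptions, not Facebook's actual tooling.
PLAINTEXT_RE = re.compile(
    r"\b(password|passwd|pwd)\s*[=:]\s*(?!\*+|\[REDACTED\])\S+",
    re.IGNORECASE,
)

def find_plaintext_passwords(lines):
    """Return the lines that look like they leak a plaintext password."""
    return [line for line in lines if PLAINTEXT_RE.search(line)]
```

Run against sample lines, `"user=alice password=hunter2"` would be flagged, while `"user=bob password=[REDACTED]"` would pass, because the negative lookahead skips values that are already masked.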

Facebook Messenger Kids App Allowed Unauthorized Users



Facebook has quietly alerted parents about a bug in Facebook’s Messenger Kids app. A design flaw enabled users to sidestep the system that allows parents to limit who can connect with their children.

The Verge reported (and Facebook confirmed) that Facebook sent out an alert to parents.

Hi [Parent],
We found a technical error that allows [Child’s] friend [Friend] to create a group chat with [Child] and one or more of [Friend]’s parent-approved friends. We want you to know that we’ve turned off this group chat and are making sure that group chats like this one won’t be allowed in the future. If you have questions about Messenger Kids and online safety, please visit our Help Center and Messenger Kids parental controls. We’d also appreciate your feedback.

In short, The Verge reported that the child who initiated the group chat could invite friends whom their own parent had approved. The rest of the kids in the group chat could then talk with each other, whether or not their own parents had approved of the other children.

Messenger Kids is a free video calling and messaging app designed for kids to connect with close friends and family from their tablet or smartphone. The app’s description says: “Kids can only connect with parent-approved contacts, which creates a more controlled environment,” and promotes group or one-on-one video calls with interactive masks, reactions, and sound effects.

Information about the Messenger Kids app stated that parents can control their child’s contact list and decide who can connect with their children. Messages in the app do not disappear and cannot be hidden, which enables parents to read them.

At a glance, it sounds as though the Messenger Kids app, which was designed for children under the age of 13, had a reasonable amount of safety and privacy mechanisms built into it. Unfortunately, a bug that affected group chats has ruined that sense of safety.


Senate Hearing to Explore Facebook’s Libra Project



Well, that didn’t take long! The U.S. Senate Committee on Banking, Housing, and Urban Affairs will hold a hearing about Facebook’s Libra project on July 16, 2019. Reuters reported that the hearing will explore the Libra project as well as any data privacy considerations it may raise.

This comes after Senate Banking Committee members wrote to Facebook asking for information on rumors about its cryptocurrency project in May of 2019. The Committee wanted to know how Facebook would protect consumer information.

Senator Sherrod Brown (Democrat – Ohio), ranking member of the Senate Committee on Banking, Housing, and Urban Affairs wrote “This new cryptocurrency will give Facebook competitive advantages with regard to collecting data about financial transactions, as well as control over fees and functionality”. He continued:

“Facebook is already too big and too powerful, and it has used that power to exploit users’ data without protecting their privacy. We cannot allow Facebook to run a risky new cryptocurrency out of a Swiss bank account without oversight. I’m calling on our financial watchdogs to scrutinize this closely to ensure users are protected.”

In addition, Representative Maxine Waters, (Democrat – California) Chairwoman of the U.S. House of Representatives Committee on Financial Services posted a statement about Facebook’s cryptocurrency. She wrote “I am requesting that Facebook agree to a moratorium on any movement forward on developing a cryptocurrency until Congress and regulators have the opportunity to examine these issues and take action. Facebook executives should also come before the Committee to provide testimony on these issues.”

My best guess is that Facebook failed to consider that creating its own cryptocurrency could result in questions from legislators. Either that, or Facebook decided it was better to ask forgiveness than permission with Libra.


Facebook Announced Libra Cryptocurrency and Calibra Wallet



Facebook announced Libra, its very own cryptocurrency powered by blockchain technology. It is also introducing Calibra, a digital wallet for Libra. The wallet will be available in Messenger, WhatsApp and as a standalone app. Facebook expects to launch these products in 2020.

Facebook points out that, for many people around the world, even basic financial services are still out of reach: almost half of the adults in the world don’t have an active bank account, and those numbers are worse in developing countries and even worse for women. The cost of that exclusion is high – approximately 70% of small businesses in developing countries lack access to credit, and $25 billion is lost by migrants every year through remittance fees.

I see a problem. People who don’t have bank accounts might not be able to afford a smartphone with which to access Calibra and the Libra cryptocurrency. I suspect Facebook is aiming mostly at businesses and not so much at people who are poor.

Facebook says that Calibra will let you send Libra “to almost anyone with a smartphone, easily and instantly as you might send a message, and at low to no cost.” In time, Facebook hopes to offer additional services for people and businesses, such as paying bills with the push of a button, buying a cup of coffee with the scan of a code, or riding your local public transit without needing to carry cash or a metro pass.

According to Facebook, Calibra will use the same verification and anti-fraud processes that banks and credit cards use. There will be automated systems that proactively monitor activity to detect and prevent fraudulent behavior. If someone gains access to your account and you lose some Libra as a result, Facebook will offer you a refund.

What about privacy? Facebook says Calibra will not share account information or financial data with Facebook or any third party without consumer consent. Personally, I wonder exactly how that consent will be given. Will users have the choice to opt-in to giving consent? Or will Calibra require that consent before people can use it?

Facebook also says Calibra customers’ account information and financial data will not be used to improve ad targeting on the Facebook family of products. Given Facebook’s history, it would be wise to be skeptical of that claim.


Facebook Plans to Launch GlobalCoin in 2020



Facebook is planning to launch GlobalCoin, its very own form of cryptocurrency, in about a dozen countries in 2020. Facebook wants to start testing GlobalCoin by the end of 2019.

Facebook wants to create a digital currency that provides affordable and secure ways of making payments, regardless of whether users have a bank account. According to the BBC, Facebook will join forces with banks and brokers that will enable people to change dollars and other international currencies into GlobalCoin. Facebook is also talking with money transfer firms like Western Union.

Personally, I can see plenty of problems with Facebook creating its own cryptocurrency. Facebook doesn’t have a good record of protecting people’s privacy or their data. If someone buys GlobalCoin, and their data or GlobalCoin account is hacked, I doubt Facebook is going to do anything about it. This whole thing feels like even more of a gamble than other types of cryptocurrency are.

What happens if a Facebook user buys GlobalCoin and then Facebook suspends that user’s account for breaking Facebook’s Terms and Policies? Does that person lose the GlobalCoin they paid for? If not, how would that user be able to access it without a Facebook account?

I’m not the only one with concerns. The U.S. Senate Committee on Banking, Housing, and Urban Affairs sent Mark Zuckerberg a letter with a bunch of questions about Facebook’s cryptocurrency.

Here are a few of the Committee’s questions:

  • What privacy and consumer protections would users have under the new payment system?
  • What consumer financial information does Facebook have that it has received from a financial company?
  • Does Facebook share or sell any consumer information (or information derived from consumer information) with any unaffiliated third parties?

Another huge problem for Facebook is that it will have to navigate the legislation that a multitude of countries have put in place regarding financial transactions. This is not going to be easy to do.