In March of this year, as you may recall, Facebook announced that it had stored hundreds of millions of user passwords in plain text. At the time, Facebook said it would notify “hundreds of millions of Facebook Lite users, tens of millions of other Facebook users, and tens of thousands of Instagram users” about the issue.
On April 18, 2019, Facebook updated its original Facebook Newsroom post titled “Keeping Passwords Secure” (which was originally published on March 21, 2019).
Here is what was added:
Since this post was published, we discovered additional logs of Instagram passwords being stored in readable format. We now estimate that this issue impacted millions of Instagram users. We will be notifying these users as we did the others. Our investigation determined that these stored passwords were not internally abused or improperly accessed.
Personally, I’m wondering just what is going on at Facebook (and Instagram) that is causing it to collect and store users’ passwords in plain text. That’s an obvious safety concern. The number of unencrypted Instagram passwords has jumped from tens of thousands to millions, and it is disturbing that Facebook misreported that number.
Not all passwords were stored unencrypted, but millions were. Why is that happening? To me, it sounds like passwords are not automatically being stored in plain text; if plain-text storage were automatic, every user’s password would have been stored unencrypted. Something, or someone, appears to be selecting certain passwords to store improperly.
Ironically, the original blog post (before Facebook added an update) recommends that users affected by this security issue change their passwords and pick strong, complex ones. That is good advice in general, but I don’t think doing so will protect users from having their unencrypted passwords stored on Facebook’s and Instagram’s servers.
It feels like we are hearing about Facebook doing nefarious things with people’s data at least once a week. The latest news comes from Business Insider which reported that Facebook harvested the email contacts of 1.5 million users without their knowledge or consent when they opened their accounts.
Business Insider has learned that since May 2016, the social networking company has collected the contact lists of 1.5 million users new to the social network. The Silicon Valley company says the contacts were “unintentionally uploaded to Facebook,” and it is now deleting them.
A security researcher noticed that Facebook was asking some users to enter their email passwords when they signed up for new accounts. This was supposedly to verify their identity. To be clear, Facebook wasn’t content with having a new user’s email address – it also wanted the password to that user’s email address.
Business Insider checked this out, and found that if you did enter an email password, a message popped up saying it was “importing” your contacts. Facebook did not ask users for permission to do that – it just went ahead and grabbed that information.
A Facebook spokesperson gave a statement to Business Insider. In it, Facebook claims that the contacts were not shared with anyone and that Facebook is now deleting them and notifying people whose contacts were imported. The statement does not say that Facebook is deleting the email passwords that it required new users to provide.
Personally, I find this disgusting. It seems like Facebook feels entitled to grab as much data as it can not only from its users – but also from people who are in the process of signing up for a Facebook account. When it gets caught doing this, it claims this was done “unintentionally.”
I find it hard to believe that someone unintentionally created something that would suck up people’s email contacts. I find it even harder to believe that the thing that sucks up contacts was unintentionally implemented as part of Facebook’s sign up process.
Facebook created Watch Party as a way for people to watch videos on Facebook together in real time. It was intended to turn watching videos into a social activity. According to Business Insider, Watch Party is popular with pirates, who use it to run illegal movie marathons.
The intended purpose of Watch Party was to allow Facebook users to host a video-watching party with friends. Everyone involved can watch the video simultaneously and comment or react in real time to what they are watching together. People could use Watch Party to watch funny videos together, or to watch a video of a family member’s graduation together.
Business Insider found that pirates are using Watch Party in a way that does not appear to be legal. Instead of hosting content that they own, or that is legally free to view, they are watching copyrighted content like movies and TV shows.
We found that illicit watch parties were a frequent occurrence on the social network, broadcasting a range of media, from relatively recent hits like “Her” to cinematic classics like “Mean Girls” and vintage TV shows like the original “Twilight Zone”.
Business Insider noted that this type of copyright infringement has, in the past, been a solitary activity. Someone illegally downloads a movie and watches it by themselves. Watch Party enables pirates to gather together to watch an illegally downloaded movie.
Personally, I don’t think Facebook takes the time to really consider how a new feature could be used by nefarious people. They just launch something, assuming that everyone on Facebook will, of course, use the feature the way Facebook intended it to be used. This leaves Facebook scrambling to stop people from doing things like using Watch Party to watch pirated movies together.
It seems that Facebook cannot prevent itself from causing security and privacy problems. According to KrebsOnSecurity, hundreds of millions of Facebook users had their account passwords stored in plain text and searchable by thousands of Facebook employees.
An anonymous Facebook insider spoke with Brian Krebs. The insider said Facebook is still trying to determine how many passwords were exposed, and for how long. So far, the investigation has uncovered archives with plain-text user passwords dating back to 2012.
KrebsOnSecurity also spoke with Facebook software engineer Scott Renfro. He said that the issue first came to light in January of 2019 when security engineers reviewing some new code noticed passwords were being inadvertently logged in plain text.
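The “inadvertently logged in plain text” failure mode is worth illustrating. Here is a minimal sketch (all names are hypothetical; this is not Facebook’s code) of how a generic request logger can capture a password, and how redacting sensitive fields before logging prevents it:

```python
# Hypothetical sketch: a login handler that logs request payloads.
# Logging the raw payload writes the password in plain text;
# redacting known-sensitive keys first keeps the logs safe.

SENSITIVE_KEYS = {"password", "passwd", "secret", "token"}

def redact(payload):
    """Return a copy of the payload with sensitive fields masked."""
    return {
        key: "[REDACTED]" if key.lower() in SENSITIVE_KEYS else value
        for key, value in payload.items()
    }

def log_line(payload):
    # The bug class here is logging `payload` directly instead of
    # `redact(payload)` -- the log archive then holds plain-text passwords.
    return "login attempt: %s" % redact(payload)

print(log_line({"email": "user@example.com", "password": "hunter2"}))
```

Archives of raw log lines like these, retained for years, are exactly the kind of searchable plain-text stores the Krebs report describes.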
Facebook sent a written statement to KrebsOnSecurity, in which Facebook said it intends to notify “hundreds of millions of Facebook Lite users, tens of millions of other Facebook users, and tens of thousands of Instagram users.”
Facebook posted information on Facebook Newsroom titled: “Keeping Passwords Secure”. In it, Facebook acknowledges that, during a routine security review in January, they found some user passwords were being stored in a readable format within their internal data storage systems. Facebook says these passwords were never visible to anyone outside of Facebook.
The information from Facebook describes how they protect people’s passwords, and provides some suggestions for securing your Facebook and Instagram accounts. Personally, considering all the security and privacy issues that Facebook has faced, the most secure thing to do would be to delete your Facebook account.
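For context on what storing passwords in a non-readable format means: the standard practice is to store a salted, one-way hash rather than the password itself, so even an insider browsing the database cannot read the original. A minimal sketch using Python’s standard library (illustrative only; this is not Facebook’s actual scheme):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted, one-way hash; store (salt, digest), never the password."""
    if salt is None:
        salt = os.urandom(16)  # a fresh random salt per user
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password, salt, digest):
    # Recompute with the stored salt and compare in constant time.
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

With this scheme the stored digest is useless to an attacker on its own; the plain-text exposure described above happens when the password is captured somewhere else (such as a log file) before it ever reaches the hashing step.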
Facebook announced a plan to tackle vaccine misinformation on Facebook by reducing its distribution and providing people with authoritative information on the topic.
Here is what Facebook plans to do:
- They will reduce the ranking of groups and Pages that spread misinformation about vaccinations in News Feed and Search. These groups and Pages will not be included in recommendations or in predictions when you type into Search.
- When Facebook finds ads that include misinformation about vaccinations, they will reject them. Facebook also removed targeting options like “vaccine controversies.” For ad accounts that continue to violate Facebook’s policies, Facebook may take further action, such as disabling the ad account.
- Facebook won’t show or recommend content that contains misinformation about vaccinations on Instagram Explore or hashtag pages.
- Facebook is exploring ways to share educational information about vaccines when people come across misinformation on this topic.
How will this work? Facebook points out that leading global health organizations, such as the World Health Organization and the US Centers for Disease Control and Prevention, have publicly identified verifiable vaccine hoaxes. If those hoaxes appear on Facebook, then Facebook will take action against them.
If a group or Page admin posts this vaccine misinformation, Facebook will exclude the entire group or Page from recommendations, reduce these groups and Pages’ distribution in News Feed and Search, and reject ads with this misinformation.
In addition, Facebook is going to provide people with additional context, so they can decide whether to read, share, or engage in conversations about information they see on Facebook. They are currently exploring ways to give people accurate information from expert organizations about vaccines at the top of results for related searches, on Pages discussing the topic, and on invitations to join groups about the topic.
BuzzFeed reported that Facebook will use machine learning and manual human review to identify and reduce specific kinds of anti-vax misinformation (such as the hoax that vaccines cause autism). Facebook will use these tools inside closed groups that are typically preferred by anti-vaxxers.
This might be the first time Facebook has done something that I am happy about! The first step toward reducing the spread of measles and other preventable diseases is to stop the spread of misinformation about vaccines.
Edison Research and Triton Digital posted “The Infinite Dial 2019”. It is the latest report in a series dating back to 1998 that covers consumer usage of media and technology and has tracked many new mediums as they develop. The Infinite Dial is the “gold standard” of nationally representative survey research.
Regarding social media, the latest study finds the number of current Facebook users continues to drop. The study shows an estimated 15 million fewer users of Facebook than in the 2017 report. The declines are heavily concentrated among younger people.
Marketplace’s Kimberly Adams interviewed President of Edison Research Larry Rosin. She pointed out the 15 million fewer Facebook users in the U.S. today than in 2017, and asked if that was a meaningful drop for Facebook. Larry Rosin responded:
I don’t see how you couldn’t say it’s a meaningful drop. Fifteen million is a lot of people, no matter which way you cut it. It represents about 6 percent of the total U.S. population ages 12 and older. What makes it particularly important is if it is part of a trend. This is the second straight year we’ve seen this number go down. Obviously, the U.S. is the biggest market, in terms of dollars, and it’s going to be a super important market for Facebook or anybody who’s playing this game.
Here is what else the study found:
- More than half the U.S. population now reports having used YouTube specifically for music in the last week. This number is now 70% among 12-34-year-olds.
- One-third of the population reported having listened to a podcast in the last month, representing 90 million monthly listeners. The spoken-word audio sector also saw increases with audiobooks, as the portion of the U.S. population that has ever listened to an audiobook surpasses one-half for the first time.
- The percentage of Americans who listen to online audio (defined as listening to AM/FM radio stations online and/or listening to streamed audio content available only on the internet) has doubled since 2012, growing from one-third of the population to two-thirds.
- Time spent listening to online audio has reached a record high this year, with weekly online audio listeners reporting an average of nearly 17 hours of listening in the last week.
To me, it sounds like Facebook is in big trouble. Fifteen million users have left Facebook since 2017, and many of them were younger people. This group is unlikely to change their minds about Facebook as they grow older.
It is possible that the growing lack of interest in Facebook had something to do with its phone number lookup that users cannot opt out of. Or, maybe teens have started to distrust Facebook after it was reported that the Facebook Research app was sucking up teenagers’ data.
Mark Zuckerberg wrote a lengthy post on Facebook titled: “A Privacy-Focused Vision for Social Networking.” In it, he provides some information about things Facebook will do to protect the privacy of its users.
Mark Zuckerberg acknowledged that Facebook doesn’t have a good reputation regarding privacy.
I understand that many people don’t think Facebook can or would even want to build this kind of privacy-focused platform – because frankly we don’t currently have a strong reputation for building privacy protective services, and we’ve historically focused on tools for more open sharing. But we’ve repeatedly shown that we can evolve to build the services that people really want, including in private messaging and stories.
Facebook is going to use privacy-enhancing techniques that it used in WhatsApp to build a privacy-focused platform. It includes end-to-end encryption that will prevent anyone – including Facebook – from seeing what people are sharing on its services.
Facebook will no longer keep messages or stories around for longer than necessary to deliver the service or longer than people want them. Mark Zuckerberg wrote: “As we build up large collections of messages and photos over time, they can become a liability as well as an asset.” People want to know that what they share won’t come back to hurt them later.
Stories already expire after 24 hours unless you archive them. Facebook wants messages to be deleted after a month or a year by default. Users would have the ability to change the time frame or turn off auto-deletion if they want to.
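The default-expiry idea described above can be sketched simply (hypothetical logic, not Facebook’s implementation): each message carries a timestamp, and the user’s retention setting decides when it should be deleted.

```python
# Hypothetical sketch of the described policy: messages expire after a
# default window unless the user picks a different window or turns
# auto-deletion off entirely.
from datetime import datetime, timedelta

DEFAULT_RETENTION = timedelta(days=30)  # "a month ... by default"
AUTO_DELETE_OFF = None                  # user disabled auto-deletion

def is_expired(sent_at, now, retention=DEFAULT_RETENTION):
    """True if the message should be deleted under the user's setting."""
    if retention is AUTO_DELETE_OFF:
        return False
    return now - sent_at > retention
```

The interesting design choice is the default: most users never change settings, so defaulting to deletion (rather than defaulting to keeping everything forever) is what actually shrinks the “large collections of messages” Zuckerberg calls a liability.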
There is also a plan to make Facebook Messenger, Instagram Direct, and WhatsApp interoperable. People on one service will be able to communicate with people on the other services. This apparently won’t work on iOS, but can work on Android.
Another big thing is the announcement that Facebook will not build data centers that store sensitive data in countries that have a track record of violating human rights like privacy or freedom of expression. Mark Zuckerberg acknowledges that this could mean Facebook’s services could be blocked in some countries.
Overall, this plan sounds good. Privacy is extremely important, and I like the idea that Facebook will allow users to delete things rather than have them stored forever. But I’m going to need to see Facebook actually make those changes before I believe it will follow through on this plan. Actions speak louder than words.