Facebook Gives You More Control Over What You See

Facebook has made some changes that will let you improve your News Feed experience. Surprisingly, it is going to let people select what they want to see first. Pick the friends whose posts you actually want to see.

Product Manager Jacob Frantz wrote a post on Facebook Newsroom about this change. Part of it says:

We’re always working to improve and personalize your News Feed experience. We know that ultimately you’re the only one who truly knows what is most meaningful to you and that is why we want to give you more ways to control what you see.

There is now an option to “Prioritize who to see first”. This gives you the opportunity to put the people, or pages, that are most important to you at the top of your News Feed. Who are you hoping to read posts from when you go on Facebook? Those are the people you should prioritize. No more scrolling through a bunch of stuff you don’t care about before finding what you came there to see.

The same post by Jacob Frantz mentions the Unfollow option. It is as though he is reminding users of that option (since it isn’t new). Use Unfollow on that person who keeps posting political articles that you are tired of seeing. You can Follow them again later, after they calm down. To do that, just select them from your list of people you have Unfollowed.

The new ability to prioritize your Facebook feed is available on iOS and will be rolling out on Android and desktop over the coming weeks.

I find these changes interesting, even though I don’t use Facebook myself. I cannot help but wonder if the ability to pick what you want to see first, and to Unfollow people – without Unfriending them, or giving them any way to know that you have stopped Following them – is a form of triage.

Giving people more control over what they see could make a person’s Facebook experience more pleasant and less aggravating. It might be what prevents people from getting tired of, or frustrated with, Facebook and quitting it forever.

Is Your Facebook Feed an Echo Chamber?

Facebook recently did some research in order to discover exactly how much individuals could be, and are, exposed to ideologically diverse news and information in social media. People are increasingly turning to social media for news. Is it your selection of friends, or Facebook’s algorithms, that have the most influence on what you see in your News Feed?

The Facebook researchers looked at individuals who use Facebook and who self-identified as either a liberal or as a conservative. They found that 9% of Facebook users in the United States classified themselves as either a liberal or a conservative.

The researchers wanted to find out how much people were being exposed to “hard news” (articles about politics, world affairs, and the economy), rather than “soft news” (stories about entertainment, celebrities, and sports). They also wanted to know whether the information in the articles was aligned primarily with liberal or conservative audiences.

The researchers found that, on average, 23% of people’s friends claim an opposing political ideology. They found that 29% of the hard news content that people’s friends share cuts across ideological lines. It turned out that 28.9% of the hard news that Facebook users saw in their News Feed cut across ideological lines. The researchers also found that 24.9% of the hard news content people actually clicked on cut across ideological lines.
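The reported figures form a simple funnel – shared by friends, seen in the News Feed, clicked – and the drop-off between stages can be computed directly. Here is a toy sketch (the percentages come from the study as reported above, but the funnel framing and helper function are my own illustration):

```python
# Reported shares of cross-cutting hard news at each stage of exposure.
stages = [
    ("shared by friends", 0.29),
    ("seen in News Feed", 0.289),
    ("clicked", 0.249),
]

def relative_drop(earlier: float, later: float) -> float:
    """Percent decrease between two consecutive funnel stages."""
    return round((earlier - later) / earlier * 100, 1)

# Walk consecutive stage pairs and report the shrinkage at each step.
for (name_a, a), (name_b, b) in zip(stages, stages[1:]):
    print(f"{name_a} -> {name_b}: {relative_drop(a, b)}% drop")
```

The noticeable number here is the roughly 14% relative drop between what people saw and what they clicked, which is the step driven by individual choice rather than by the ranking algorithm.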

What does all this mean? Facebook says that the composition of a person’s social network is the most important factor affecting the mix of content encountered on social media. Individual choice also plays a large role. Facebook says the News Feed ranking has a smaller impact on the diversity of information a person sees from the opposing ideological viewpoint than does who they have selected as friends.

In other words, Facebook says that the friends you choose have more of an influence on what you see on Facebook than does the News Feed algorithm. You could be, intentionally or unwittingly, creating an echo chamber by only friending people who match your ideological viewpoint.

On the other hand, there’s an interesting article on Medium that takes a look at Facebook’s study. Eli Pariser points out that the Facebook research was done on just 9% of Facebook users (a small number of overall users), and that those users could behave differently on Facebook than people who don’t identify themselves as either liberal or conservative. He also notes that since this was done by Facebook scientists, the study is not reproducible – at least, not without Facebook’s permission to reproduce it.

Facebook’s Messenger Platform Adds New Ways to Connect With Friends & Businesses

One of the main announcements from Facebook’s F8 Conference yesterday was the launch of Facebook’s new Messenger Platform, which will open the Messenger SDK to third-party apps and services, as well as a new customer service initiative that allows customers to connect with businesses directly over Facebook Messenger.

Users will be able to access third-party Messenger content apps via a button next to the options for adding photos or stickers. Using these apps, users can create custom GIFs, videos, and other personalized content and send their creations to friends and family through Messenger. The recipient will receive a link to download or open the third-party app to view and respond to the message with their own content. Facebook’s David Marcus emphasized that the content apps will not be included within Facebook Messenger, but will remain standalone apps that link to Messenger:

“If we added a 10th of the capabilities [directly to Messenger] that we’ve added with partners today, it would make it really slow… If you don’t want to use those things, you’re not forced to… because those experiences don’t live inside of Messenger. It’s not like the overall experience of the app is getting very bloated.”

The Messenger platform will launch with support for a wide range of popular apps, including ESPN, Bitmoji, JibJab, Legend, Ultratest, Ditty, Giphy, FlipLip, ClipDis, Memes, PicCollage, Kanvas, Action Mobile FX, Boostr, Camoji, Cleo Video Texting, Clips, Dubsmash, Effectify, EmotionAR, EMU, Fotor, Gif Keyboard, GifJam, Hook’d, Imgur, Imoji, Keek, Magisto, Meme Generator, Noah Camera, Pic Stitch, PingTank, Score on Friends, Selfied, Shout, StayFilm, Facebook Stickered, Strobe, Tackl, Talking Tom, Tempo, The Weather Channel, to.be Camera, and Wordeo. More apps will undoubtedly follow as developers have a chance to experiment with the SDK.

Facebook also hopes to improve interaction between customers and businesses by bringing the intuitive Messenger experience to the marketplace. Facebook has partnered with businesses like Zulily and Everlane to allow customers to cancel, modify, and track orders directly within Messenger. Integration with ZenDesk and other customer service platforms will let businesses respond to customer inquiries through Messenger as well. In addition, businesses will be able to send push notifications to consumers’ devices, even if the business itself does not have its own app.

Although no monetization strategy has been announced for Messenger’s new business features, it’s likely that the information Facebook gleans from these customer-to-business interactions, such as where users shop and what products they buy, will allow for improved ad targeting, perhaps leading to more profits from Facebook’s ad platform down the line.

All in all, the improvements to Messenger seem to confirm that Facebook is continuing its efforts to differentiate itself from the competition and protect its substantial market share. The emphasis on keeping third-party functionality separate from the core Messenger app also coincides with Facebook’s gradual transition from an all-in-one app to multiple apps for specific functions.

What do you think of Facebook’s F8 announcements? Which of Messenger’s new features are you excited to try out?

Facebook to Add Suicide Reporting


Let’s face it: social media has become the main way we communicate with each other today. We have all seen posts from friends in good times and in bad. When things are bad, it can be hard to gauge just how bad they are, but places like Facebook are often where troubled people reach out. Now Facebook intends to do something to help.

Over the next few weeks Facebook will be rolling out a new tool to allow users to flag posts made by someone they are concerned about.

The tool will allow users to report someone they think might be suicidal or might hurt themselves in some way. After flagging the post, you will be given several options: send a message to the person or to a mutual friend, get information on how to support the person, or have a trained team at Facebook get in touch with them. You can also talk to a Facebook team member for advice on what to do next.

CNET has a full gallery walkthrough of how to use the tool.

Atlas Lets Advertisers Track you Online and Offline

There is an old saying that goes something like “You aren’t paranoid if they really are out to get you.” Many people have expressed concern about the amount of information that Facebook has and whom they might share it with. Now that Facebook has launched Atlas, it is clear that your information really is being given to corporations.

Facebook just announced that they have launched Atlas. They wrote: “We’ve rebuilt Atlas from the ground up to tackle today’s marketing challenges, like reaching real people across devices and bridging the gap between online impressions and offline purchases”.

Facebook then points people toward the Atlas blog. The blog post discusses something called “people-based marketing”, which is described as “helping marketers reach real people across devices, platforms, and publishers”.

In short, Atlas will enable advertisers to track people across the internet from one device to the next and across platforms. A unique feature of Atlas is its ability not only to track what ads a person sees online, but also to bridge the gap between online and offline. Atlas is going to connect offline purchases – that’s right, purchases not made via the internet – with the ads that a person viewed.

The purpose, of course, is to help companies to find out how well their ads are doing. It’s all about helping big companies make more money. There isn’t anything about Atlas that benefits real people. Instead, it invades the privacy of people who happen to use Facebook by letting companies track not only what ads the person saw online but also the things that person later went out into the real world to purchase.

In addition to Facebook, the Atlas blog says that Instagram is also a “publisher”. That means it is “now enabled to measure and verify ad impressions”. Atlas is looking for more companies to become partners with them right now. You can find a list of the current companies that have partnered with Atlas on their blog.

Could You Quit Facebook for 99 Days?

Could you go 99 days without Facebook? It is a question that is worth asking yourself, especially if you are someone who checks into Facebook several times a day. A group called Just wants to encourage people to give up Facebook for 99 days. They are calling this effort 99 Days of Freedom.

What would your life be like without Facebook? Would you feel uncomfortable about not visiting the popular social media website every day? Would you miss it? Maybe your life is so busy that you wouldn’t really notice the absence of Facebook. That might be true for those who use Facebook infrequently.

There is a bigger question to consider. Would you be happier without Facebook? That is the question Just is focused on. Just launched this experiment in response to Facebook’s controversial mood experiment. Unlike Facebook, Just is not interested in manipulating your mood. Instead, they are interested in determining how life without Facebook impacts users’ happiness.

Joining the “99 Days of Freedom” experiment is easy. Change your profile picture on Facebook to the icon you see at the top of this blog. Share your last link. Don’t use Facebook for 99 days. That means no logging in, no messenger, and no sharing.

Just will contact you after day 33, 66, and 99 to see how you are doing. Give Just your email address if you would like to join their happiness survey. You can put a countdown on your Facebook page to let your friends know when you will return (as well as why you are taking a break).

The selection of 99 days was intentional. Just feels that participants would lose interest in the experiment if it ran longer than 99 days. They also felt that a smaller number of days would make it harder to assess behavioral change.

To be clear, this experiment is not a protest against Facebook. Instead, it is viewed by Just as a way for people to experience the emotional benefits of moderation. Those who take part will help Just discover if people truly are happier without having Facebook in their lives.

Were you Part of Facebook’s Psychology Experiment?

Much has been said about how Facebook utilizes the information that its users choose to post. There have been many blogs regarding privacy issues (especially when Facebook makes changes to its privacy settings). People are aware that their photos or posts could be included in Facebook advertising. But were you aware that Facebook can also use your data for psychology experiments?

Scientists at Facebook published a paper that appears in the current issue of The Proceedings of the National Academy of Sciences of the United States of America. The paper is titled “Experimental evidence of massive-scale emotional contagion through social networks”.

The psychological experiment on Facebook took place over one week (January 11 – 18, 2012). Participants were randomly selected based on their User ID. There were about 155,000 participants who posted at least one status update during the experimental period.

The experiment manipulated the extent to which people were exposed to emotional content in their News Feed. The scientists were looking for something they refer to as “emotional contagion”. By this, they meant that they were watching for signs that emotional states can be transferred from one person to another without direct interaction (and in the absence of nonverbal cues). What they discovered is that “emotional contagion” really can happen. From the abstract:

When positive expressions were reduced, people produced fewer positive posts and more negative posts. When negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.

It is a very interesting finding. Unfortunately, it was discovered by scientists secretly manipulating some Facebook users’ emotions by tweaking whether they were shown positive or negative posts during the experimental period. It feels like a horrible thing to do to random people who had no idea they were being used as “guinea pigs” in a psychological experiment.

If you are on Facebook, then you have agreed to be part of experiments like this one when you clicked that you agree to the Facebook Data Use Policy. Part of it says that potential uses of your data include “internal operations, including troubleshooting, data analysis, testing, research, and service improvement”.

The scientists stayed within those boundaries to do the experiment. They used machine analysis to select positive and negative posts. This enabled the experiment to be done without having human researchers read user data that contained personal information. I cannot help but wonder how many other psychological experiments have happened on Facebook (or if more will happen in the future).

Facebook had an Outage

Facebook had a temporary outage that affected all web and app traffic to the social network. The popular social network was down for somewhere between ten minutes and about half an hour (depending upon which news source you read and which country it was located in). At the time I am writing this blog, Facebook has returned to its usual service.

The Guardian reported that this was the longest outage that Facebook has had in four years. It also noted a noticeable drop in the number of Facebook referrals to The Guardian while Facebook was out of service.

About 1.28 billion users were suddenly unable to access Facebook while it was having the outage. TechCrunch reports that Facebook was unavailable in multiple regions around the world. This included the UK, France, Belgium, and parts of Asia (including India).

Personally, I noticed that my friends who live in Australia were posting Tweets in which they wondered why Facebook was down. It was rather amusing to see people from all over the world turn to Twitter to complain (and make jokes) about Facebook’s outage.

TechCrunch also reported that the outage affected not only the Facebook website and its smartphone and tablet apps but also some Facebook plug-ins that were attached to other websites. Those of you who use Facebook and have connected it to other websites may want to check and see how you were affected by the outage. Or, you may want to check your stats to see how Facebook’s outage affected traffic to your website.

What happened that caused Facebook to have an outage? That hasn’t been revealed. Several websites (including The Guardian) posted a statement that came from Facebook. It said:

Earlier this morning, we experienced an issue that prevented people from posting to Facebook for a brief period of time. We resolved the issue quickly, and we are now back to 100%. We’re sorry for any inconvenience this may have caused.

Facebook Rolls Out Expanded Privacy Checkup Tool

Have you ever worried about accidentally sharing too much on Facebook? I don’t mean the concern that it might be inappropriate to share all the details of your hospital stay. Instead, I mean the fear that your post could be read by people you never intended to see it. Facebook is aware of these concerns.

A blog post on Facebook Newsroom gives details about a change to privacy settings that is being rolled out. Current users of Facebook will soon see a “Privacy Checkup” pop-up appear when they make a post.

“We just wanted to make sure you’re sharing with the right people”, it helpfully states. It offers a brief tutorial about each privacy setting, making it easier for users to select how private they want an individual post to be. There will be more “Privacy Checkup” pop-ups later on, if it has been a while since a user has changed who can see his or her posts.

As of May 22, 2014, when a new person joins Facebook, the default audience of their first post will be set to “Friends”. Previous to this change, new users of Facebook had their default audience set to “Public”.

In addition to that, new Facebook users will get an automatic “reminder” that appears when they make their first post. It points to the privacy setting button that is attached to each post and asks “Who would you like to see your post?” If the person chooses to ignore that popup, their post will automatically be set to “Friends”.

Overall, these changes could help prevent Facebook users from embarrassing themselves by posting something publicly that was intended to only be seen by their friends. This change is very similar to one that took effect in October of 2013 that changed the default privacy setting on the posts on Facebook accounts of teens (age 13 through 17). It went from having the default privacy setting be “Friends of Friends” to “Friends” only.

Groups Ask the FTC to Investigate the WhatsApp Deal

The Electronic Privacy Information Center and the Center for Digital Democracy are asking the Federal Trade Commission (FTC) to investigate how the WhatsApp deal will impact the privacy of its users. Facebook acquired WhatsApp just a few weeks ago.

The concern is that Facebook will use the personal information of WhatsApp’s more than 450 million users to target advertising. Those who started using WhatsApp before it was acquired by Facebook were told that WhatsApp would not collect user data for advertising revenue. The complaint states:

Facebook routinely makes use of user information for advertising purposes and has made clear that it intends to incorporate the data of WhatsApp users into the user profiling business model. The proposed acquisition will therefore violate WhatsApp users’ understanding of their exposure to online advertising and constitutes an unfair and deceptive trade practice, subject to investigation by the Federal Trade Commission.

On June 18, 2012, WhatsApp posted a blog titled “Why we don’t sell ads”. Perhaps the key point is this sentence: “Remember, when advertising is involved you the user are the product.”

WhatsApp also posted a blog on February 19, 2014, titled “Facebook”. It is about the acquisition. The key point from that blog might be this sentence: “Here’s what will change for you, our users: nothing.” The blog promises that users can still count on absolutely no ads interrupting their communication through WhatsApp. Facebook has issued a statement indicating that they will honor WhatsApp’s commitments to privacy and security.

This situation reminds me of some words of wisdom that get passed around. You cannot be certain that anything posted on “the internet” (on a blog, in a chat, or through social media) will be kept private forever. That being said, I can understand why users of WhatsApp feel betrayed. WhatsApp promised not to sell their data for advertising purposes. Will Facebook keep that promise? It will be very interesting to see what the FTC thinks about this situation.