Category Archives: Meta

Meta Adds Discord-Like Features To Facebook Groups



Meta (parent company of Facebook and Instagram) announced that it is testing new ways to quickly access your favorite Facebook Groups and to simplify how they are organized. Meta also introduced channels, which are focused spaces for people to connect with their communities in smaller, more casual settings.

Here are some of the features that Meta is adding to Facebook Groups:

Meta is testing a new sidebar that helps you find your favorite groups more quickly. It will list your groups and the latest activity within them, like posts or chats you haven’t seen yet. You can also pin your favorite groups so they show up first, discover new groups, or create your own group.

Within your group, you’ll see a new menu that includes things like events, shops and a variety of channels to make it easier to connect with others around the topics you care about.

Admins can create channels to connect with their groups in smaller, more casual settings where they can have deeper discussions on common interests or organize their communities around topics in different formats.

Community chat channels: a place for people to message, collaborate and form deeper relationships around topics in a more real-time way across both Facebook Groups and Messenger.

Community audio channels: a feature where admins and members can casually jump in and out of audio conversations in real time.

Community feed channels: a way for community members to connect when it’s most convenient for them. Admins can organize their communities around topics within the group for members to connect around more specific interests.

The Verge reported that the changes Meta made to Facebook Groups look a lot like Discord, with a left-aligned sidebar and a channels list for Groups. According to The Verge, the changes give off “some serious Discord vibes.” The redesign also adds a lot of purple, which evokes Discord’s look.

The Verge also pointed out that part of the new Facebook Groups includes text chats, audio rooms, and feed rooms where people can post and comment. Again, it looks a lot like Discord. Meta included images that show what Facebook Groups will look like. It just so happens to have focused on a group that is for gamers, perhaps to boost Facebook gaming.

It isn’t unheard of for social media companies to copy features that originated somewhere else. Many of them have a tendency to “copy” another social media platform’s “homework” rather than creating something unique on their own platform.

In short, Meta decided to take the lazy way out and copy-paste the features it saw in Discord. It is unclear what, exactly, Meta hopes will happen next. I suppose it is possible for Discord to object to having its main features appropriated by Meta. Personally, I doubt that people will leave Discord, where their game-playing friends already are, in favor of Meta’s version.


Meta Facing Lawsuits Claiming Its Algorithms Cause Addiction



Meta (parent company of Facebook and Instagram) is facing eight lawsuits filed in courthouses across the US over the last week that allege that excessive exposure to platforms including Facebook and Instagram has led to attempted or actual suicides, eating disorders and sleeplessness, among other issues, Bloomberg reported. More specifically, the lawsuits claim that the company built algorithms into its platforms that lure young people into destructive addiction.

According to Bloomberg, one of the new suits was filed by Naomi Charles, a 22-year-old woman who says she started using Meta platforms when she was a minor and that her addiction led her to attempt suicide and caused other suffering. Naomi Charles, like other plaintiffs, is seeking monetary damages to compensate for mental anguish, loss of enjoyment of life, and the costs of hospitalization and medical bills.

The claims in the suits include defective design, failure to warn, fraud, and negligence. The complaints were filed in federal court in Texas, Tennessee, Colorado, Delaware, Florida, Georgia, Illinois and Missouri.

NBC News reported on a separate case, filed in the Northern District of California on behalf of Alexis Spence, who was able to create her first Instagram account at the age of 11, without her parents’ knowledge and in violation of the platform’s minimum age requirement of 13.

According to NBC News, the complaint alleges that Instagram’s artificial intelligence engine almost immediately steered the then-fifth grader into an echo chamber of content glorifying anorexia and self-cutting, and systematically fostered her addiction to using the app. The lawsuit was filed by the Social Media Victims Law Center, a Seattle-based group that advocates for families of teens harmed online.

That lawsuit is the first of its kind to draw from the Facebook Papers, exposing the real harm behind its findings, Alexis Spence’s attorneys say. The suit also features previously unpublished documents from the leaks, including one in which Meta identified “tweens” as “herd animals” who “want to find communities where they can fit in.” The attorneys argue that the documents demonstrate Meta’s efforts to recruit underage users to its platforms.

NBC News also reported that Tammy Rodriguez, a Connecticut woman, has filed a lawsuit against Meta and Snap, the parent company of Snapchat, over the companies’ alleged roles in her 11-year-old daughter’s suicide last summer.

Business Insider reported on another lawsuit, filed by a Tennessee mother who claims that her 15-year-old daughter’s heavy use of Meta’s products led her to suicidal ideation and self-harm.

According to documents seen by Business Insider, the woman’s attorneys said the daughter received notifications from the apps all day, causing her to become addicted to them. She also grappled with an eating disorder, severe anxiety, depression, and poor sleep, according to the lawsuit.

A Meta spokesperson declined to comment on the litigation to Bloomberg, but noted that the company has developed tools for parents to track their children’s activity on Instagram and set time limits. Meta also offers “Take A Break” reminders that nudge users to take a moment away from social media.

Personally, I find it difficult to believe that the solution is to point parents towards resources that could help them track their child’s activity on Instagram. The harm has already been done.


Australian Watchdog Group Sues Meta Over Fake Crypto Ads on Facebook



The Australian Competition & Consumer Commission (ACCC) has sued Meta for misleading conduct over its publication of scam celebrity crypto ads on Facebook. The lawsuit also names Meta Platforms Ireland Limited (which is part of Meta).

The ACCC alleges that Meta “engaged in false, misleading or deceptive conduct by publishing scam advertisements featuring prominent Australian public figures.” It also alleges that Meta aided and abetted or was knowingly concerned in false or misleading conduct and representations by the advertisers.

The ACCC alleges that the ads, which promoted investment in cryptocurrency or money-making schemes, were likely to mislead Facebook users into believing the advertised schemes were associated with the well-known people featured in the ads, such as businessman Dick Smith, TV presenter David Koch, and former NSW Premier Mike Baird. The schemes were in fact scams, and the people featured in the ads had never approved or endorsed them.

According to the ACCC: “The ads contained links that took Facebook users to a fake media article that included quotes attributed to the public figure in the ad endorsing a cryptocurrency or money-making scheme. Users were then invited to sign up and were subsequently called by scammers who used high pressure tactics, such as repeated phone calls, to convince users to deposit funds into the fake schemes.”

Reuters reported a quote from ACCC Chair Rod Sims, who said: “The essence of our case is that Meta is responsible for these ads that it publishes on its platform. It is alleged that Meta was aware… scam ads were being displayed on Facebook but did not take sufficient steps to address the issue.”

The Guardian reported that the scam has likely raked in millions from unsuspecting people. One 77-year-old grandmother lost $80,000 to the scheme, while the ACCC said another person lost $650,000.

The Sydney Morning Herald posted a response from a Meta company spokesman, who said the company did not want ads seeking to scam people out of money or mislead people on Facebook.

Personally, I do not believe the statement the Meta spokesperson gave. Meta is a huge company, and if it truly wanted to protect users from being harmed by fake crypto ads, it should have immediately acted to remove them. Meta left those ads up.


Meta Backtracks On Allowing Violent Threats to Russian Soldiers



CNBC reported on March 14, 2022, that Meta has backtracked on their terrible decision. According to CNBC, Meta Platforms clarified that users cannot make posts calling for the assassination of Russia’s president Vladimir Putin or other heads of state.

Meta (parent company of Facebook) also said that a previously reported temporary easing of its hate speech policy now only applies to posts by users in Ukraine. Originally, the easing applied to users in several other countries as well.

CNBC also reported on an internal post on Sunday, written by Meta President of Global Affairs Nick Clegg. He wrote that the company is “now narrowing its focus to make explicitly clear in the guidance that it is never to be interpreted as condoning violence against Russians in general.” Nick Clegg added, “We do not permit calls to assassinate a head of state.”

The recent statements from Nick Clegg contradict what has previously been reported by Reuters. Recently, Meta chose to allow Facebook and Instagram users in some countries to call for violence against Russians and Russian soldiers in the context of the Ukraine invasion. Meta even gave users a template sentence to use: ‘death to the Russian invaders’.

Reuters reported that Meta was also allowing some users to post calls for death to Russian President Putin or Belarusian President Alexander Lukashenko (according to internal emails to its content moderators).

Meta also has another significant problem. Nick Clegg tweeted: “Responding to reports that the Russian government is considering designating Meta as an extremist organization for its policies in support of speech:” The tweet includes a screenshot of a letter-length statement from Nick Clegg. In my opinion, it feels like a desperate attempt to convince people that Meta didn’t mean what it said regarding its own hate speech policy.

The tweet was posted after Reuters reported that Russian prosecutors asked a court to designate Meta Platforms as an “extremist organization,” and the communications regulator said it would restrict access to Meta’s Instagram starting March 14. (Russia had previously blocked Facebook).


Russia Asks Court to Declare Meta an “Extremist Organization”



Yesterday, Meta Platforms (the parent company of Facebook) chose to allow Facebook and Instagram users in some countries to call for violence against Russians and Russian soldiers in the context of the Ukraine invasion. Meta even provided an example sentence that it would allow: “death to the Russian invaders”. Reuters reported Meta was temporarily allowing users to post calls for death to Russian President Vladimir Putin or Belarusian President Alexander Lukashenko.

At the time, the temporary policy (that allowed some users to break Meta’s hate speech policy) was for users in Armenia, Azerbaijan, Estonia, Georgia, Hungary, Latvia, Lithuania, Poland, Romania, Russia, Slovakia, and Ukraine.

Today, Reuters reported that Russian prosecutors asked a court to designate Meta Platforms as an “extremist organization,” and the communications regulator said it would restrict access to Meta’s Instagram starting March 14. The company said the decision would affect 80 million users in Russia.

The Verge reported that one week after placing a ban on Facebook in Russia, the country’s communication agency Roskomnadzor announced it will ban Instagram, too.

According to The Verge, the Facebook ban cited “discrimination against Russian media”. The Instagram ban is happening because of a decision by parent company Meta directing moderators to allow posts calling for violence against Russian soldiers if they originate from certain countries, including Ukraine.

Interestingly, Russia decided not to block WhatsApp (which is also owned by Meta). Reuters reported that Russia’s RIA News agency cited a source saying the messaging app is considered a means of communication, not a way to post information.

Personally, I am not at all surprised that Meta’s decision to temporarily ignore its policy against hate speech and violence has drawn this response from Russia. It is never a good idea to encourage anyone to post hate speech and/or violence online, and Meta is a large enough company that it should have known better.


Meta Allows Ukraine War Posts Urging Violence Against Invading Russians



Reuters reported that Meta Platforms (the parent company of Facebook) will allow Facebook and Instagram users in some countries to call for violence against Russians and Russian soldiers in the context of the Ukraine invasion. Reuters clarifies that this is a temporary change to Meta’s hate speech policy.

Reuters also reported, citing internal emails to Meta’s content moderators, that the social media company is temporarily allowing some posts that call for the death of Russian President Vladimir Putin or Belarusian President Alexander Lukashenko.

A Meta spokesperson gave the following statement to Reuters:

“As a result of the Russian invasion of Ukraine we have temporarily made allowances for forms of political expression that would normally violate our rules like violent speech such as ‘death to the Russian invaders.’ We still won’t allow credible calls for violence against Russian civilians.”

It sounds like Meta has provided a template sentence that people can use without facing any consequences.

Reuters reported that the calls for leaders’ deaths will be allowed unless they contain other targets or include two indicators of credibility, such as a location or method. The temporary policy change on calls for violence against Russian soldiers applies to Armenia, Azerbaijan, Estonia, Georgia, Hungary, Latvia, Lithuania, Poland, Romania, Russia, Slovakia, and Ukraine.

On February 28, Nick Clegg, President of Global Affairs at Meta, tweeted: “We have received requests from a number of Governments and the EU to take further steps in relation to Russian state controlled media. Given the exceptional nature of the current situation we will be restricting access to RT and Sputnik across the EU at this time.”

On March 3, Meta announced that it was committing $15 million to support humanitarian efforts in Ukraine and neighboring countries. That includes $5 million in direct donations to UN agencies and more than a dozen nonprofits, including International Medical Corps, which will use the funds to deploy mobile medical units in Ukraine, and Internews, which will support at-risk journalists and human rights defenders. Meta is also donating to UNICEF to support children and families in Ukraine.

It would have been better if Meta focused on those two things, and stopped there.

What will Meta do when, sometime in the future, another war starts? Will their hate speech policy be temporarily ignored again? Meta cannot offer a healthy community while it is looking the other way when people post death threats.


Meta Introduces a Personal Boundary for Horizon Worlds



Meta announced that they are adding a Personal Boundary for Horizon Worlds and Horizon Venues. The Personal Boundary prevents avatars from coming within a set distance of each other, creating more personal space for people and making it easier to avoid unwanted interactions.

A Personal Boundary prevents anyone from invading your avatar’s personal space. If someone tries to enter your Personal Boundary, the system will halt their forward movement as they reach the boundary. You won’t feel it; there is no haptic feedback. This builds upon Meta’s existing hand harassment measures, which were already in place and made an avatar’s hands disappear if they encroached upon someone’s personal space.

According to Meta, the Personal Boundary is on by default because Meta wants to use it to help set behavioral norms in virtual spaces. That said, two people can still willingly extend their arms outside of their Personal Boundaries in order to do a high-five or a fist bump.

The Verge provided some background that may explain why Meta added the Personal Boundary. According to The Verge, when Horizon Worlds was in beta testing, at least one beta user complained that her avatar had been groped by a stranger. The Personal Boundary gives everyone a two-foot radius of virtual personal space, which puts the equivalent of four virtual feet between two avatars.
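To make the geometry concrete, here is a minimal sketch of how such a movement clamp could work, assuming a flat 2D position model. The constant names and the clamp_move function are illustrative assumptions; Meta has not published how Horizon Worlds actually implements the boundary.

```python
import math

# Each avatar gets a 2 ft personal radius, so two boundaries touching
# means the avatars are 4 virtual feet apart. (Illustrative values.)
PERSONAL_RADIUS_FT = 2.0
MIN_SEPARATION_FT = 2 * PERSONAL_RADIUS_FT

def clamp_move(mover_pos, target_pos, other_pos):
    """Halt the mover's forward motion at the other avatar's boundary.

    mover_pos, target_pos, other_pos: (x, y) tuples in virtual feet.
    Returns the position the mover actually ends up at.
    """
    dx = target_pos[0] - other_pos[0]
    dy = target_pos[1] - other_pos[1]
    dist = math.hypot(dx, dy)
    if dist >= MIN_SEPARATION_FT:
        return target_pos  # move is fine; the boundaries never overlap
    if dist == 0:
        return mover_pos   # degenerate case: refuse the move entirely
    # Push the requested position back out to exactly the boundary edge,
    # along the line between the two avatars (a silent stop, no haptics).
    scale = MIN_SEPARATION_FT / dist
    return (other_pos[0] + dx * scale, other_pos[1] + dy * scale)
```

For example, an avatar at (0, 0) trying to step to (3, 0) while another avatar stands at (5, 0) would be stopped at (1, 0), preserving the full four feet of separation.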

Personally, I think that Meta is doing the right thing by instituting a Personal Boundary. It shows that they understand that some of the people who want to experience Horizon Worlds or Horizon Venues will choose to behave badly towards others. The Personal Boundary makes it impossible for them to attempt to grab, grope, or sexually assault someone in the VR setting.