Tag Archives: Meta

Instagram Introduces New Ways to Verify Age on Instagram



Instagram announced that it is testing new options for people to verify their age on Instagram, starting with people based in the U.S.

If someone attempts to edit their date of birth on Instagram from under the age of 18 to 18 or over, Instagram will require them to verify their age using one of three options: upload their ID, record a video selfie or ask mutual friends to verify their age. Instagram is testing this out so they can make sure teens and adults are in the right experience for their age group. Instagram is also partnering with Yoti, a company that specializes in online age verification, to help ensure people’s privacy.

Here is more information about verifying age:

In addition to having someone upload their ID, Instagram is testing two new ways to verify a person’s age:

Video Selfie: You can choose to upload a video selfie to verify your age. If you choose this option, you’ll see instructions on your screen to guide you. After you take a video selfie, Instagram will share the image with Yoti, and nothing else. Yoti’s technology estimates your age based on your facial features and shares that estimate with Instagram. Meta and Yoti then delete the image. The technology cannot recognize your identity, only your age.

Social Vouching: This option allows you to ask mutual followers to confirm how old you are. The person vouching must be at least 18 years old, must not be vouching for anyone else at that time, and will need to meet other safeguards Instagram has in place. The three people you select to vouch for you will receive a request to confirm your age and will need to respond within three days.

Instagram points out that you will still be able to verify your age by uploading a form of identification, like a driver’s license or ID card. They will use your ID to confirm your age and help keep their community safe. Your ID will be stored securely on Instagram’s servers and deleted within 30 days.

The Wall Street Journal reported that Instagram is adding these extra steps as part of its efforts to ensure an “age-appropriate” experience for minors. While children under 13 are prohibited by the network’s terms of service, those who say they are ages 13 to 17 can use it with some limitations.

According to The Wall Street Journal, Instagram doesn’t verify the age a user declares when creating an account, and Instagram said these new tools won’t change that.

TechCrunch reported that there are two basic use cases for Instagram’s new verification system: adults who registered as teens by mistake and are trying to enter their correct age, and teens who are trying to circumvent the platform’s age-appropriate restrictions.

Personally, I think that one of the reasons why Instagram is announcing this new age-check system may have something to do with the lawsuits that Meta (parent company of Facebook and Instagram) is facing. In short, some of those lawsuits include claims of defective design, failure to warn, fraud, and negligence.

Some of the lawsuits are from people who are now adults who claim they were harmed by Instagram. Others are parents of tweens or teens who experienced suicidal ideation or self-harm after using Instagram.


WhatsApp Now Lets You Transfer Your Chat History From Android to iPhone



Today, Mark Zuckerberg wrote on Facebook, “We’re adding to WhatsApp the ability to securely switch between phones and transfer your chat history, photos, and voice messages between Android and iPhone while maintaining end-to-end encryption. This is a top requested feature. We launched the ability to switch from iPhone -> Android as well.”

It appears that this was first spotted by the WABetaInfo website, which provided information for people who want to migrate their chat history from Android to iOS. The first thing to know is that you need at least Android 5 installed on your Android device and iOS 15.5 on your iPhone. WABetaInfo says that because iOS 16 is a beta version, the transfer is not guaranteed to work, since WhatsApp does not provide support for beta versions of iOS.

When you transfer your chat history across different platforms, WhatsApp is not able to see the data you transfer. In addition, if you want encrypted backups on your new phone, you need to manually enable the end-to-end encrypted backup option within WhatsApp for iPhone, even if you had already enabled encrypted backups on WhatsApp for Android.

The Verge points out that if you already have a preexisting iOS chat history, then the imported Android history will overwrite it. That’s definitely something to consider if you are someone who likes to save your chats.

Engadget reported that the WhatsApp feature will help you move your content over from Android to iOS. It will be part of Apple’s existing “Move to iOS” tool. To be clear, WhatsApp’s feature is available as a beta for now, so you may encounter bugs during the transfer process.

According to Engadget, when you select WhatsApp, it will open automatically and prompt you to give permission to move your data over to iOS. Depending on the amount of content you have, it’ll take a while to package everything up and transfer it to your iPhone. Apple will also pre-load the WhatsApp icon on your home screen so you can just tap it to finish installing it on your new iPhone, instead of having to go through the App Store.

TechCrunch reported that the transfer to iOS also moves your account information, profile picture, individual chats, group chats, chat history, media, and settings. However, you can’t transfer your call history or display name.

Overall, it sounds to me like the ability to transfer your WhatsApp information from Android to iOS could be enticing for people who were already thinking about getting an iPhone. One thing to keep in mind is that Meta (parent company of Facebook and Instagram) owns WhatsApp.


Meta Facing Lawsuits Claiming Its Algorithms Cause Addiction



Meta (parent company of Facebook and Instagram) is facing eight lawsuits filed in courthouses across the US over the last week that allege that excessive exposure to platforms including Facebook and Instagram has led to attempted or actual suicides, eating disorders and sleeplessness, among other issues, Bloomberg reported. More specifically, the lawsuits claim that the company built algorithms into its platforms that lure young people into destructive addiction.

According to Bloomberg, one of the new suits was filed by Naomi Charles, a 22-year-old woman who says she started using Meta platforms when she was a minor and that her addiction led her to attempt suicide, among other suffering. Naomi Charles, like other users, is seeking monetary damages to compensate for mental anguish, loss of enjoyment of life, and the costs of hospitalization and medical bills.

The claims in the suits include defective design, failure to warn, fraud, and negligence. The complaints were filed in federal court in Texas, Tennessee, Colorado, Delaware, Florida, Georgia, Illinois and Missouri.

NBC News reported about a separate case in the Northern District of California, which was filed on behalf of Alexis Spence, who was able to create her first Instagram account at the age of 11 without her parents’ knowledge and in violation of the platform’s minimum age requirement of 13.

According to NBC News, the complaint alleges that Instagram’s artificial intelligence engine almost immediately steered the then-fifth grader into an echo chamber of content glorifying anorexia and self-cutting, and systematically fostered her addiction to using the app. The lawsuit was filed by the Social Media Victims Law Center, a Seattle-based group that advocates for families of teens harmed online.

That lawsuit is the first of its kind to draw from the Facebook Papers while exposing the real harm behind its findings, Alexis Spence’s attorneys say. The suit also features previously unpublished documents from the leaks, including one in which Meta identified “tweens” as “herd animals” who “want to find communities where they can fit in.” The attorneys argue that the documents demonstrate Meta’s efforts to recruit underage users to its platforms.

NBC News also reported that Tammy Rodriguez, a Connecticut woman, has filed a lawsuit against Meta and Snap, the parent company of Snapchat, over the companies’ alleged roles in her 11-year-old daughter’s suicide last summer.

Business Insider reported about another lawsuit, filed by a Tennessee mother who claims that her 15-year-old daughter’s heavy use of Meta’s products led her to suicidal ideation and self-harm.

According to documents seen by Business Insider, the woman’s attorneys said the daughter received notifications from the apps all day, causing her to become addicted to them. She also grappled with an eating disorder, severe anxiety, depression, and poor sleep, according to the lawsuit.

A Meta spokesperson declined to comment on the litigation to Bloomberg, but noted that the company has developed tools for parents to track their children’s activity on Instagram and set time limits. Meta also offers “Take A Break” reminders that nudge users to take a moment away from social media.

Personally, I find it difficult to believe that the solution is to point parents towards resources that could help them track their child’s activity on Instagram. The harm has already been done.


Instagram Changes Its Ranking System to Highlight Original Content



Adam Mosseri, Head of Instagram, tweeted about new features. “We’ve added new ways to tag and improve ranking: Product Tags, Enhanced Tags, Ranking for originality. Creators are so important to the future of Instagram, and we want to make sure that they are successful and get all the credit they deserve.”

TechCrunch reported that shortly after that announcement, a spokesperson from Instagram sent an email saying that Instagram is making changes to its ranking algorithm to prioritize the distribution of original content, rather than reposted content, in places like the Reels tab and feed.

The Verge reported that product tags are now available to everyone on Instagram, and you can assign yourself to a category like “Photographer” or “Rapper” and have that category show up every time you’re tagged in a post. Instagram is also going to start more heavily promoting original content on the platform.

The Verge also suggested that this is Instagram’s way of saying “Please, please, please stop just posting your favorite TikToks to Reels. We’re begging you.”

Engadget reported that the move to prioritize original content comes as Instagram has taken other steps to incentivize creators to post original content on its platform first, rather than re-sharing clips from TikTok and other apps. According to Engadget, the change seems to be geared toward discouraging accounts that simply aggregate and distribute popular memes and other re-posted content.

In addition, Engadget pointed out that those who don’t like Instagram’s ranked feed have an alternative now. Instagram brought back its chronological feed, but it is not enabled by default.

This news comes at a really good time for me, personally. I was in the process of deleting my Instagram account, photo by photo. The process is tedious and time-consuming, and you can only delete one photo (or video) at a time. I noticed I had a lot of art on there, so I decided to make my account a showcase for my art and changed its name to reflect that.

People who create original content and post it on Instagram should get credit for their work. It has always bothered me when social media accounts scrape other people’s original content and try to pass it off as their own. I am happily surprised that Instagram is going in a direction that protects artists and their content.


Facebook Wants You To Share Reels From Third-Party Apps



Meta (parent company of Facebook) has introduced Sharing to Reels. It is described on the Meta for Developers site as “a new way for developers to make it easy for people to share video directly to Facebook”.

Enabling Sharing to Reels makes it easy for people to share short-form videos directly to Facebook. Once integrated, third-party apps will have a Reels button so people can share short videos, then customize them with Reels editing tools like audio, text, effects, captions and stickers. Instead of downloading their video content and uploading it later, they can now create and share video seamlessly with one tap.
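To picture what that one-tap hand-off looks like from a third-party app’s side, here is a minimal Kotlin sketch using a standard Android share intent. This is only an illustration of the general pattern; Meta’s actual Sharing to Reels integration is documented on the Meta for Developers site and may rely on its own SDK and intent action. The MIME type and the assumption that the Facebook app (package com.facebook.katana) receives the clip directly are mine, not taken from Meta’s announcement.

    import android.content.Context
    import android.content.Intent
    import android.net.Uri

    // Illustrative sketch only: a generic Android share intent that hands a
    // short video off to the Facebook app. Meta's real Sharing to Reels
    // integration may use its own SDK and intent action; the package name
    // and MIME type here are assumptions for illustration.
    fun shareVideoToFacebook(context: Context, videoUri: Uri) {
        val intent = Intent(Intent.ACTION_SEND).apply {
            type = "video/mp4"                              // assumed MIME type of the clip
            putExtra(Intent.EXTRA_STREAM, videoUri)         // the video the user just created in the app
            setPackage("com.facebook.katana")               // target the Facebook Android app directly (assumption)
            addFlags(Intent.FLAG_GRANT_READ_URI_PERMISSION) // let the receiving app read the content URI
        }
        // Only launch if an app is installed that can handle this intent.
        if (intent.resolveActivity(context.packageManager) != null) {
            context.startActivity(intent)
        }
    }

In practice, the Reels button inside an app like Smule, Vita, or VivaVideo would do something along these lines: hand the finished clip to Facebook, which then opens its Reels editing tools so the person can add audio, text, effects, captions and stickers before posting.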

At launch, Facebook has partnered with Smule, Vita, and VivaVideo who have integrated #SharingToReels and are finding new ways for Creators to express themselves, grow their communities, and reach new audiences.

Personally, I’ve never heard of those companies. I’m also wondering why Facebook didn’t choose to include Instagram which has its own version of Reels (and is also connected to Meta). That seems like the obvious choice!

TechCrunch reported: While Reels first began as a way to directly combat TikTok with a feature inside the Instagram app, Meta also brought them to Facebook shortly after. The company touted during its Q4 2021 earnings that Reels is now its “fastest-growing content format by far.” The company also said Reels was the biggest contributor to growth on Instagram and “growing very quickly” on Facebook, too.

Facebook also did not mention TikTok, which is pretty much all about short-form video. Why? Engadget may have the answer to that question.

Engadget reported: Facebook is taking another step to encourage users to create original content for its TikTok clone. The company introduced a “sharing to Reels” feature to allow users of third-party apps to post directly to Facebook Reels.

Engadget also reported: Now, with Facebook losing users to TikTok, Meta CEO Mark Zuckerberg has staked a lot on the success of Reels. He said last fall that Reels would be “as important for our products as Stories” and that reorienting its service to appeal to younger users was the company’s “North Star”.

In short, Facebook made a clone that does what TikTok and Instagram have already been doing. Cloning features from other social media platforms is not new. If Facebook excludes TikTok and/or Instagram from Reels, Facebook users might simply decide to continue posting their content on either Instagram or TikTok instead of bringing it to Facebook.


Australian Watchdog Group Sues Meta Over Fake Crypto Ads on Facebook



The Australian Competition & Consumer Commission (ACCC) has sued Meta for misleading conduct over its publication of scam celebrity crypto ads on Facebook. The lawsuit also names Meta Platforms Ireland Limited (which is also part of Meta).

The ACCC alleges that Meta “engaged in false, misleading or deceptive conduct by publishing scam advertisements featuring prominent Australian public figures.” It also alleges that Meta aided and abetted or was knowingly concerned in false or misleading conduct and representations by advertisers.

The ACCC alleges that the ads, which promoted investment in cryptocurrency or money-making schemes, were likely to mislead Facebook users into believing the advertised schemes were associated with the well-known people featured in the ads, such as businessman Dick Smith, TV presenter David Koch, and former NSW Premier Mike Baird. The schemes were in fact scams, and the people featured in the ads had never approved or endorsed them.

According to the ACCC: “The ads contained links that took Facebook users to a fake media article that included quotes attributed to the public figure in the ad endorsing a cryptocurrency or money-making scheme. Users were then invited to sign up and were subsequently called by scammers who used high pressure tactics, such as repeated phone calls, to convince users to deposit funds into the fake schemes.”

Reuters reported a quote from ACCC Chair Rod Sims, who said: “The essence of our case is that Meta is responsible for these ads that it publishes on its platform. It is alleged that Meta was aware… scam ads were being displayed on Facebook but did not take sufficient steps to address the issue.”

The Guardian reported: The scam has likely raked in millions from unsuspecting people. One 77-year-old grandmother lost $80,000 in the investment, while the ACCC has said another person lost $650,000 through the scam.

The Sydney Morning Herald posted a response from a Meta company spokesman, who said the company did not want ads seeking to scam people out of money or mislead people on Facebook.

Personally, I do not believe the statement the Meta spokesperson gave. Meta is a huge company, and if it truly wanted to protect users from being harmed by fake crypto ads, it should have immediately acted to remove them. Meta left those ads up.


Meta Backtracks On Allowing Violent Threats to Russian Soldiers



CNBC reported on March 14, 2022, that Meta had backtracked on its terrible decision. According to CNBC, Meta Platforms clarified that users cannot make posts calling for the assassination of Russia’s president Vladimir Putin or other heads of state.

Meta (parent company of Facebook) also said that a previously reported temporary easing of its hate speech policy now applies only to posts by users in Ukraine. Originally, the temporary easing of hate speech restrictions applied to users in several other countries as well.

CNBC also reported about an internal post on Sunday, written by Meta President of Global Affairs Nick Clegg. He wrote that the company is “now narrowing its focus to make explicitly clear in the guidance that it is never to be interpreted as condoning violence against Russians in general.” Nick Clegg added, “We do not permit calls to assassinate a head of state.”

The recent statements from Nick Clegg contradict what has previously been reported by Reuters. Recently, Meta chose to allow Facebook and Instagram users in some countries to call for violence against Russians and Russian soldiers in the context of the Ukraine invasion. Meta even gave users a template sentence to use: ‘death to the Russian invaders’.

Reuters reported that Meta was also allowing some users to post calls for death to Russian President Putin or Belarusian President Alexander Lukashenko (according to internal emails to its content moderators).

Meta also has another significant problem. Nick Clegg tweeted: “Responding to reports that the Russian government is considering designating Meta as an extremist organization for its policies in support of speech:” The tweet includes a screenshot of a letter-length statement from Nick Clegg. In my opinion, it feels like a desperate attempt to convince people that Meta didn’t mean what it said regarding its own hate speech policy.

The tweet was posted after Reuters reported that Russian prosecutors asked a court to designate Meta Platforms as an “extremist organization,” and the communications regulator said it would restrict access to Meta’s Instagram starting March 14. (Russia had previously blocked Facebook).