Snapchat could face a fine of millions of pounds after the UK data watchdog issued it with a preliminary enforcement notice over its alleged failure to assess the privacy risks its artificial intelligence chatbot may pose to users, particularly children, The Guardian reported.
The Information Commissioner’s Office (ICO) said it had provisionally found that the social media app’s owner failed to “adequately identify and assess the risks” to several million UK users of My AI, including 13 to 17-year-olds.
According to The Guardian, Snapchat has 21 million monthly active users in the UK and has proved to be particularly popular among younger demographics, with the market research company Insider Intelligence estimating that 48% of users are aged 24 or under. About 18% of UK users are aged 12 to 17.
“The provisional findings of our investigation suggest a worrying failure by Snap [the parent of Snapchat] to adequately identify and assess the privacy risks to children and other users before launching My AI,” said John Edwards, the information commissioner.
The BBC reported that the ICO warned it could shut down the My AI feature in the UK after a “preliminary investigation.”
The US company said it was “closely reviewing” the provisional findings.
Snapchat describes My AI as an “evolving feature” powered by ChatGPT, an online AI tool that uses new technology to generate convincingly realistic responses.
Snap, the parent company behind Snapchat, became the first social media platform to adopt an artificial intelligence-powered chat function earlier this year.
According to BBC, Snap said it would “work constructively” with the ICO after it issued a preliminary notice against the company, adding that it had carried out a “robust legal and privacy review” before the function went public.
The data watchdog stressed its findings are not final, and it has not concluded that the company breached any data protection laws.
At this stage, the notice is a signal to Snap to ensure My AI complies with data protection rules, including the Children’s Code (the Age Appropriate Design Code).
Engadget reported that Information Commissioner John Edwards said the ICO’s provisional findings from its investigation indicated a “worrying failure by Snap to adequately identify and assess the privacy risks to children and other users” before rolling out My AI. The ICO noted that if Snap failed to sufficiently address its concerns, it may block the ChatGPT-powered chatbot in the UK.
“My AI went through a robust legal and privacy review process before being made publicly available,” a Snap spokesperson told Reuters. “We will continue to work constructively with the ICO to ensure they’re comfortable with our risk assessment procedures.”
According to Engadget, soon after Snap rolled out the chatbot, parents raised concerns about My AI, and not only over privacy considerations. “I don’t think I’m prepared to know how to teach my kid how to emotionally separate humans and machines when they essentially look the same from her point of view,” a mother of a 13-year-old told CNN in April. “I just think there is a really clear line [Snapchat] is crossing.”
In my opinion, parents should have the right to decide whether or not to allow their kids to use Snapchat. Parents who have concerns about their child talking with an AI can choose to take Snapchat away, and potentially give it back when the child is older.