Would You Let AI Choose Your Child’s Babysitter?



Parents want to find a reliable, experienced, and compassionate person to babysit their child. Some parents find that person in a relative or a very close friend. Others will ask for recommendations from other parents that they know and trust. This system of vetting potential babysitters has been used for a very long time.

The Washington Post published an article about a company called Predictim. The same article was also posted on McCall.com.

Predictim offers an online service that uses “advanced artificial intelligence” to assess a babysitter’s personality. It scans through the potential babysitter’s Facebook, Twitter, and Instagram posts, and gives an automated “risk rating”.

According to the article, the “risk rating” can indicate the risk that the babysitter is a drug abuser. It also assesses the babysitter’s risk of bullying, harassment, being disrespectful, and having a bad attitude.

It does not gather any information about how long the person has been a babysitter. It doesn’t ask if the babysitter has a degree in Early Childhood Education or Teaching. It doesn’t find out if the babysitter knows CPR, has worked with children who have special needs, or has worked in a daycare center.

The article says that the price of a Predictim scan starts at $24.99. It requires a babysitter’s name, email address, and her consent to share broad access to her social media accounts. Babysitters who decline are told that “the interested parent will not be able to hire you until you complete this request.”

In my opinion, as a person who has a teaching degree and who has spent years working in daycare, the Predictim analysis is both dangerous and misleading. What does Predictim do with the data it collects from babysitters’ social media accounts? Will this data be shared with employers in other fields? What if this data is leaked or stolen by nefarious people?

Many babysitters are teenagers, and I question the ethics of gathering personal data from people who are not adults. Does Predictim get permission from those teenagers’ parents before pulling information from their social media accounts?

Another huge problem with using AI is that it tends to pick up the biases of whoever created it. Predictim’s AI could wind up excluding babysitters who are people of color, LGBT, of certain religious or ethnic backgrounds, or simply not photogenic enough in their Instagram posts.

Image by Pexels