Watching from the cot: are smart toys and baby products worth it for parents?


We’re increasingly littering our homes with smart devices, from TVs and fridges to home assistants, known broadly as “the internet of things”. The internet of things now extends to devices aimed at new parents, marketed as making parenting easier and babies safer.

These include the types of products you’d expect (wifi-enabled baby monitors) and a whole range of more surprising objects (remote-operated white noise machines; smart cots that soothe babies to sleep; socks that monitor a baby’s heart rate and oxygen levels; smart toys that get to know their child owner). There are even surveillance systems that read the facial expressions, sounds and movements of babies, with the promise of alerting parents to potential dangers lurking in their little one’s cot.

Many baby monitoring devices work by using facial recognition technology designed to pick up changes in a child’s expression, such as when the baby is crying or in distress. Some devices may also record happier moments, like a child laughing or smiling, and store them in the manufacturer’s cloud server.

While this is great in theory – a baby is in danger, a parent in another room can be alerted to act, or conversely, a joyful moment that may otherwise be missed is captured and saved – in practice the outcomes can be more perverse.

Invasive by design

Smart baby monitors use artificial intelligence (AI) systems to recognise a baby’s activity. These AI systems have been trained on databases of baby faces and cries, and the bigger the database, the better they work.

The Washington Post reports that many smart monitors will feed original footage they collect back into the AI systems that power them, improving the product’s capabilities. So, essentially, a family that buys a smart baby monitor is not just the customer; they are part of the product too.

Surveilling and collecting data from intimate household spaces is what makes these devices function as promised. Samantha Floreani of Digital Rights Watch says: “Many of these devices are data-extractive and invasive by design, without adequate privacy or security protections.”

The data can be used in other ways too. “It’s also about who they might sell that data to, how it might be combined with other datasets, and what happens if that company has a data breach,” she says.

Meanwhile, the American Academy of Pediatrics “does not recommend using video or direct-to-consumer pulse oximetry monitors [such as smart socks and smart vests] as a strategy to reduce the risk of a sleep-related death” and flags overall concerns about these products’ accuracy and reliability.

Still, the global baby monitoring market is forecast to grow to $1.63bn by 2025, and the smart toy market is expected to reach $18bn by 2023. In Australia, smart monitors are increasingly prevalent and typically range from $50 to $200, while other, higher-tech devices can cost many times that amount.

So not only are parents paying a premium for products that aren’t proven to have health or safety benefits for babies, but they’re also having data harvested from the most intimate parts of their lives when they do so.

Data to last a lifetime

There are very real concerns about not only what companies are doing with data today, but what might happen to that data in the future. The Office of the Victorian Information Commissioner notes that companies often keep data collected from smart home devices in perpetuity, “in case” it becomes useful at a later date.

Baby monitoring devices begin storing data about a child from birth, and Australia currently has no legal or regulatory provisions for a personal right to erasure, nor rules on how long a company can store data or what data can be kept. It is therefore possible that data captured by a baby monitor will be knocking around somewhere for the rest of a child’s life, with unknowable consequences.

But some consequences are knowable: the most obvious being future manipulation by advertisers. “The data that a single device collects might seem benign on its own, but when you combine this with other devices and the data that they collect, it can … paint a very clear picture of your life, habits, relationships and behaviours,” Floreani says.

Over the course of a child’s lifetime, that picture can give advertisers an insurmountable advantage, translating into the power to manipulate preferences and behaviours, ultimately undermining personal choice.

The profiles built from data collected from the cot, and throughout a child’s life, may also have impacts on their social and economic participation. The World Economic Forum warns that the on-selling of data to third parties, and ongoing profiling, could result in discrimination later in life – for instance, when applying for jobs or bank loans, all based on past “actions conducted in the privacy of the family’s home”.

Babies and children cannot provide meaningful consent to the privacy notices that come with products, or to being surveilled. Inconsistency across the board when it comes to privacy notices and rules also makes it difficult for parents to know exactly what they have signed up for.

All of the issues above are in play even when the data is being used legally. Following the high-profile data breaches at Medibank, MyDeal and Optus, there are no assurances the information these devices collect won’t fall into the hands of malicious third parties.

The devices can also be directly hacked. Last year Wired reported that millions of web camera and baby monitor feeds were vulnerable to hackers, due to flaws in software used in more than 83m devices (rather than in the products themselves). The software had weak security protections, which could allow “an attacker [to] watch video feeds in real time, potentially viewing sensitive security footage or peeking inside a baby’s crib”.

Floreani notes that just one poorly protected smart device in the home can be a weak link. “If the security is weak, it could act as a gateway for hackers to access other devices on your network,” she says.

For another cautionary tale, look to My Friend Cayla. The early smart doll used facial and voice recognition to function, but was accessible to anyone within 30 feet (nine metres) of the toy if they had downloaded the app that controlled it – meaning anyone nearby could listen in to the user through the toy. Following exposure of this security flaw, the doll was declared illegal in many countries.

But Floreani is careful to point out that the responsibility for smart devices in the home isn’t a personal one. “While I think we should always think critically about the kinds of digital technologies we invite into our homes, we also need stronger regulations in place to ensure that devices are meeting security standards and that companies are respecting our privacy.

“Individuals shouldn’t have to go to great lengths or opt out of using devices altogether to protect their privacy,” she says.

Ultimately, baby monitoring devices prey on the fears and insecurities of parents, amplifying and using those fears to sell products. But the companies that develop and sell baby monitoring devices are far less likely to be concerned with a child’s privacy and security than the families that buy their wares.

  • Kat George is a writer and public policy professional. Her work focuses on access and inclusion, consumer and human rights, regulation and new technology. She is a non-executive director at Choice and Hope Street Youth and Family Services. All views expressed in her writing are her own
