Many children dream of having an interactive doll or robot that can say more than a canned “I love you” when they hug it. However, the Internet of Things is a dangerous world, and a lot has changed in the toy industry over the last few years. Everything is digitalized, smart and, most of all, connected. Barbie is no longer just a doll that needs its clothes changed and hair brushed. Kids can now speak to their toys, possibly sharing quite personal information or simply their play preferences. This toy-diary of sorts is what’s at stake, as the new smart toys currently on the market are equipped with speakers and microphones, making them capable of holding a basic conversation with a child. Dolls can not only talk to kids but also patiently listen to everything said around them, which can have dangerous consequences.
Regardless of its nature and origin, every toy capable of recording sound should be treated with caution. Not long ago, a troubling case came to light in which companies were found to be tracking children in order to keep an eye on them, resulting in some major corporations being fined for storing cookies intended for future targeted advertising. The current situation is much the same, only this time it involves your kids’ own speech.
Companies defend themselves by stating that the collected information is used to improve the toy’s ability to maintain a fluent conversation with the child. They also argue that collecting audio is necessary to develop a toy capable of learning and growing along with the child. However, this information can be sold to third-party companies for targeted advertising and, if a hacker gains access to it, it can also be used to break into your network and possibly reach other smart devices, such as your alarm system.
Many of these toys use Wi-Fi connections, and because they’re toys, you probably won’t think to protect them with a strong password. This only makes a hacker’s job easier: all they need to do is gain access to the smart doll, and from there the rest of the network is cracked open for them.
Furthermore, hackers can make the doll say whatever they want to the children, as demonstrated by a benevolent hacker who exposed the security flaws in My Friend Cayla by making the cute doll swear. All it took to break this particular doll was an unprotected Bluetooth connection; since there was no encryption, it was easy to turn Cayla nasty. Companies tend to learn from such flaws and make their toys safer for children. However, according to the same hacker, in this particular case “all they did was put one more step in the process of getting it to swear for us”, which only shows how careful you must be when choosing your child’s new interactive best friend.