WhatsApp gets parent mode for children
WhatsApp accounts for 10- to 12-year-olds will have restricted functions. Strangers will no longer be able to message children.
(Image: DenPhotos/Shutterstock.com)
Meta is introducing child accounts for its messenger WhatsApp. These can be set up by parents. The target group is 10- to 12-year-olds. Stricter rules and fewer functions will apply to them.
With such a child account, children can use the messenger only for making calls and sending messages. Broadcast channels and status updates, features clearly inspired by classic social networks, are not available. The timing is no surprise: whether and how children and adolescents should use social media is currently the subject of political debate.
Among the restrictions is that strangers cannot simply message the children. Parents or guardians decide who can communicate with the child and which groups children can join.
Parents cannot read their children's messages. As is customary with WhatsApp, all messages are end-to-end encrypted. This means that Meta also has no insight. Messages can only be seen by senders and recipients.
To set up child mode, the child's smartphone and a parent's smartphone must be held next to each other during setup; this links the two devices. The link can also be established later via the settings. On the child's smartphone, the settings are protected by a parental PIN. According to WhatsApp, however, the feature is being rolled out gradually, so it may take some time before it actually becomes available.
Operators must protect minors
Meta's Instagram offers a similar mode for children. There, too, parents can restrict who may message their children and adolescents. A major concern online is so-called cybergrooming: adults deliberately contacting minors with abusive intent. On Instagram, content can also be specifically hidden or controlled.
TikTok offers a comparable companion mode ("Family Pairing"). Among other things, it lets parents set times during which children cannot use TikTok; followers can also be reviewed and blocked.
To protect children and adolescents, the Digital Services Act (DSA) also stipulates that special rules apply to them. These primarily target technical designs with harmful effects: so-called dark patterns, meaning manipulative designs that push users toward constant use, addiction, or other unwanted actions, are prohibited. Platform operators are required to minimize all risks for minors; how exactly this will look in practice, however, remains open.
(emw)