When we think about issues relating to children and their technology usage, the first thing that generally comes to mind is the sheer amount of time they spend glued to their screens.
While concern about screen time is entirely reasonable, parents should also focus on their children's online safety and the sort of information and data their children might be unintentionally giving away.
Social media makes it extremely easy for children—and adults alike—to give away personal information such as their full names, ages, phone numbers, and even addresses. Kids playing online games, too, might occasionally share information about themselves in chat groups, especially when playing group-style games.
In the U.S., the Children’s Online Privacy Protection Rule, or COPPA, imposes requirements on operators of websites and online services directed at children when they collect information about children under 13 years of age. The rule also covers sites and online services that knowingly collect personal information from children under 13.
While this rule has helped set better guidelines for protecting children, it’s not always easy to determine users’ ages online. Nor do countries agree on what age defines a “child”: the UN Convention on the Rights of the Child counts anyone under 18.
Further, not every technology company has age-assurance processes, such as age verification or self-declaration, for new users joining the platform. And even where such checks exist, children can bypass them relatively easily by entering false birth dates and other details.
In early September 2022, Meta was fined almost USD 400 million for violating the European Union’s data privacy law in its handling of children’s data on Instagram. Ireland’s Data Protection Commission imposed the fine, one of the largest to date under the General Data Protection Regulation (GDPR), Europe’s data privacy law.
The four-year-old GDPR has long been criticized as weakly enforced. Building on the GDPR, lawmakers in the U.S. state of California enacted a law last week that requires online services to strengthen protections for children and will change how social media and gaming platforms treat minors. Britain, too, has passed a similar law.
Until these laws fully come into effect, parents can do a lot to drive the conversation about online privacy and protect their children.
To begin, parents could introduce their children to the dangers of giving away too much information online and walk them through real-life cases involving children who did so. This creates an open space for children to discuss issues they might face online.
In addition, going through the privacy settings on social media and online gaming platforms can do a lot to educate children about their options. Platforms like Instagram and TikTok, for example, let users keep their accounts private and restrict direct messages from unknown users. Learning these settings builds transferable skills for navigating other online platforms. It may also be worth reviewing the sort of content children view on social media and how they interact with others online.
Ultimately, parents can learn a lot by researching social media and other online platforms and keeping up with how they change.