Earlier this month, TikTok announced that it will introduce screen time limits for accounts belonging to users under 18. As concerns grow about social media addiction among children, the 60-minute daily limit may restore some control over how minors use the technology.

These limits are more suggestions than firm sanctions, however, and may at least start conversations in households about children's phone usage.

Once the limit is reached, children under 13 will require a parent or guardian to enter a passcode before they can continue scrolling through their feed; those aged 13-17, however, will be able to set their own passcodes, meaning the restrictions are largely advisory.

If a teenager spends more than 100 minutes on the app in a day, however, it will force them to create their own passcode. Weekly updates from the app also give users insights into their viewing habits and how long they spend watching content.
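To make the tiers concrete, here is a minimal Python sketch of that decision logic. The function, constants, and field names are illustrative assumptions drawn from the description above, not TikTok's actual implementation.

```python
from dataclasses import dataclass

DAILY_LIMIT_MINUTES = 60        # default daily limit for all under-18 accounts
FORCED_PASSCODE_MINUTES = 100   # point at which 13-17s must set a passcode

@dataclass
class Session:
    age: int
    minutes_watched: int
    has_own_passcode: bool = False

def next_action(s: Session) -> str:
    """Return the action the app takes for an under-18 account (illustrative only)."""
    if s.minutes_watched < DAILY_LIMIT_MINUTES:
        return "continue"                    # still under the 60-minute default
    if s.age < 13:
        return "require_guardian_passcode"   # a parent must unlock further time
    if s.minutes_watched > FORCED_PASSCODE_MINUTES and not s.has_own_passcode:
        return "force_passcode_setup"        # teens past 100 minutes must set a code
    return "prompt_own_passcode"             # 13-17s may extend with a self-set code
```

Under this model, `next_action(Session(age=15, minutes_watched=110))` returns `"force_passcode_setup"`, mirroring the 100-minute rule described above.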

Studies published in scientific outlets including the Journal of Social and Clinical Psychology suggest that around 30 minutes a day may be the sweet spot for social media use, enough for users to stay connected with friends and family and enjoy some entertainment. This will obviously vary from person to person, and many studies indicate that heavy social media use can lead to a range of problems, such as body image issues, low self-esteem, and depression.

By treating 100 minutes as the threshold at which users must create a passcode, TikTok is gradually clarifying objective standards for what may constitute problematic social media habits in young people. Granted, older teenagers can obviously bypass these passcodes, but they are a step in the right direction in encouraging children to use social media responsibly.

Parents who have their own TikTok accounts will also be able to link them to their children's profiles, giving them additional controls and information, including how much time their children spend on the app and how often they open it.

Guardian settings also let parents schedule times when notifications are muted and customize time limits for different days of the week. These analytical features have plenty of potential, but there is still room for improvement.

Algorithms that notify parents when a child has been viewing dangerous material, identified through key phrases and hashtags, could soon be on the horizon. Social media can be a minefield for impressionable children, and many technology companies have been criticized for their inability to curb content encouraging eating disorders, violence, or self-harm.
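As a rough sketch of what such a notification trigger might look like, the following Python snippet matches a video's caption and hashtags against a watchlist of terms. The term list, function name, and matching scheme are hypothetical; a production system would be far more sophisticated than simple string matching.

```python
import re

# Hypothetical watchlist: a real system would rely on a far larger,
# safety-team-curated list alongside machine-learned classifiers.
FLAGGED_TERMS = {"#selfharm", "#proana", "thinspo"}

def flags_for_parent(caption: str, hashtags: list[str]) -> set[str]:
    """Return any watch-listed terms found in a video's caption or hashtags."""
    tokens = {tag.lower() for tag in hashtags}              # e.g. "#selfharm"
    tokens |= set(re.findall(r"[#\w]+", caption.lower()))   # words and inline tags
    return FLAGGED_TERMS & tokens
```

A non-empty result from a call such as `flags_for_parent("new thinspo tips", ["#SelfHarm"])` could then prompt a notification to a linked parent account.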

The difficulty is the sheer volume of content on the internet; while human moderators and AI systems can do their best to stem the tide, giving parents information about their children's viewing habits would extend this protective power to households as well.

Ultimately, apps such as TikTok and Instagram will need to continue introducing more measures to improve online safety for children.

Lawmakers around the world are paying close attention to social media's effect on the young, meaning that new tools to improve its use will no doubt be on the horizon if big tech wishes to avoid further regulation.
