New Instagram Rules: Too little, too late?
It may be too late and not enough, but it is a start. In the firing line for overlooking child safety, Instagram has finally announced measures to rein in the sweeping online freedom that has left children, some as young as 8, vulnerable and exposed. A few have also tripped over the fatal edge.
The Meta-owned platform says its new safety controls will stop children under 18 from receiving messages from strangers. Kids under 16 will also automatically move to ‘teen accounts,’ in a phased manner beginning with the UK, US, Canada and Australia.
Under the new policy, Teen Accounts are switched to a private setting and direct messages (DMs) from anyone other than friends are filtered out. Sleep mode will be switched on automatically, and teens on the app will be reminded to log out after 60 minutes.
It also says parental permission will be required to change settings, although 16- and 17-year-olds can opt out of the restrictive settings themselves. But there is no happily ever after just yet: 'sleep mode' may sound good on paper, but the tech giant is not unaware of the sleepless creatures it has unleashed. Moreover, for Gen Z, Instagram and Snapchat are their WhatsApp, a means of communication.
The Zuckerberg-led platform also promises to clamp down on sensitive and offensive content, a critical issue. This acceptance, and hopefully strict implementation, nonetheless leaves a bad taste. If the platform can promise to fix the problem now, the stonewalling of the past is tragic: children's lives and safety were knowingly compromised at the altar of profit.
Legal age on Instagram
Perhaps it is sensible to see how the latest measures unfold before handing out any certificates, especially on the issue of dodging the age limit. Instagram says it is putting more filters in place for age verification so that children can no longer fudge their records. That is easier said than done: age verification has never stopped kids in the past, and experts say it may not now. How else are children as young as 8 or 9 (the legal age on the platform is 13) selling skincare knowledge on IG?
Instagram also says it is 'building technology to proactively find accounts that belong to teens and automatically place them in protected, age-appropriate settings.' Unfortunately, its word is not enough. This train left the 'platform' long ago, and Meta allowed it.
The company did so by shrugging off mental health concerns, including self-harm, and there have been instances where its algorithm even recommended graphic self-harm content. Furthermore, its lack of content moderation exposed a generation to grooming and child predators. Cases like that of Sean 'Diddy' Combs play out in the real world and even then take years to come to light; imagine the online world, where anonymity allows people and acts to disappear into the dark web.
Instead of safeguarding the vulnerable, Meta knowingly refused to shut down accounts belonging to children under the age of 13, despite reportedly receiving as many as a million complaints. Not only that, it also collected their personal information without parental consent. For years, it looked the other way and ignored warnings that it was harming the mental health of teens; a whistle-blower has accused CEO Mark Zuckerberg of being part of this unholy arrangement.
Against unchecked control
With the genie already out of the bottle, teens are shrugging off the incoming changes, and industry watchers expect little, if any, dent in Instagram's profits. Experts also fear that teens already steeped in a heady digital existence will be motivated to find ways around the restrictions.
With additional parental controls, Instagram has shifted the onus onto families to take charge of their children's social media habits. It says parents can restrict the app and filter content, but it does not address how parents who are still unaware of their child's multiple accounts are supposed to do so. It is not a win-win solution.
Yet families are responsible for how social media figures in their children's lives. To leave it all to Big Tech is a reboot of 'Home Alone.' If their teenager's well-being matters, parents must take over the controls and keep a keen eye out for red flags because, as Molly Russell's family will tell you, social media is not a game. Molly, 14, killed herself after accessing content about self-harm and suicide. Christopher Dawley, 17, shot himself with his phone still in his hand.
Countries like Australia are not taking any chances: the government sees a strong case to act and is moving ahead with plans to pass a law banning Australian teenagers under a certain age from using social media.
And in the UK, as more schools line up to ban smartphones, there are stirrings of a global movement pushing back against the unchecked control of social media companies. Technology and profit at the expense of teenagers, the biggest clients of platforms like Snapchat, TikTok and Instagram, demand accountability. So, are Instagram's new rules trustworthy?
Big Tech enabled social media to become the Wild West. Even now, it is not about the well-being of children. After the backlash, Meta's back is to the wall as it faces lawsuits from 33 US states for deliberately exposing teens and tweens to risks. In the words of Gen Z, it is all sus.