24-01-2026 12:00:00 AM
The rise of online gaming platforms like Roblox and Fortnite has transformed how children spend their time, socialize, and entertain themselves. However, these immersive digital worlds come with significant risks that parents, experts, and society are increasingly grappling with. A leading cybersecurity expert outlined the evolution of gaming from simple multiplayer experiences like PUBG to more complex platforms like Roblox. He described Roblox as an "addiction framework disguised as a social network and a game," where children can design avatars, create their own games, engage in live voice chats, and make in-app purchases.
He emphasized the safety concerns, including exposure to violence, sexualized content through stickers and cartoons, and the psychological toll of unlimited dopamine hits. He warned that this addiction often leads to children secretly using parents' credit cards for purchases, resulting in financial exploitation. Beyond that, he highlighted grooming risks in multiplayer environments where strangers interact freely, blurring the lines between fun and danger.
A child and adolescent psychotherapist, drawing from her experience working with children and families in Delhi, approached the issue from a psychological perspective. She cautioned against labeling excessive gaming as mere "addiction" or "internet gaming disorder," arguing that this strips away the broader context. She pointed out that children's brains are still developing and are bombarded with intense stimulation in short bursts, which can lead to overstimulation. However, she shifted the focus to societal factors, questioning what alternatives adults offer when devices are taken away. In urban settings like Delhi, she noted, parks are often inaccessible due to weather or misuse as parking lots, leaving screens as the most convenient option, not just for kids but for busy parents
who use devices as "babysitters." She viewed gaming addiction as a symptom of deeper issues, like a lack of real-world engagement opportunities, rather than an isolated disorder.
A lawyer and former office bearer of the Young FICCI Ladies Organization brought a personal touch to the debate as a mother navigating these challenges with her 10- and 12-year-old sons. She shared how she and her husband initially set strict rules against guns and video games, only to relent under their children's persistent arguments. The boys convinced them that Fortnite wasn't as harmful as perceived, citing peer pressure as a "harmless" factor and emphasizing that they balanced gaming with sports, studies, and competitions.
She admitted her apprehension about this unfamiliar world, describing efforts to play alongside her kids and keep the console in a communal space for monitoring. She recounted a striking example from her son's birthday, where virtual "skins" and avatars were prized over physical gifts like T-shirts, illustrating how deeply entrenched these digital economies are in children's social lives. Despite her hands-on approach, she acknowledged the difficulty in fully understanding or controlling the content.
A former president of the Center for Digital Economy Policy added a regulatory and policy lens, comparing these platforms' seemingly harmless cartoonish designs to infamous social media phenomena like the "Blue Whale Challenge," which encouraged self-harm. He stressed that platforms like these are not mere toys but potent instruments where children interact with strangers, potentially leading to inappropriate conversations, grooming, or migration to unmonitored apps like Discord.
He criticized the reactive nature of platform moderation, noting that age verification is easily bypassed through black-market child accounts sold to predators. He argued for a broader regulatory framework to penalize platforms that prioritize engagement and monetization over safety, drawing parallels to how societies regulate food, medicine, and education for children. Referencing global examples, such as Australia's ban on social media for kids under 16 and Germany's smartphone restrictions for those under 15, he urged governments to step in where parental oversight falls short.
It was pointed out that predators exploit this anonymity, leading to grooming, inappropriate chats (sometimes migrating to Discord), and exposure to violence or sexualized material. In-app purchases feel like "fake" virtual money but result in real credit card bills, fueling dark monetization and financial exploitation of children. Some individuals reportedly earn significant income (e.g., thousands of dollars monthly) selling avatars and skins, largely to under-18 users using parents' cards. Age verification remains ineffective, bypassed via fake accounts sold on sites like eBay or through forged documents.
Experts called for stronger measures. Cybersecurity experts advocated government oversight, similar to film ratings or China's time limits on games (e.g., 30-60 minutes based on content). A transparent evaluation system, perhaps a committee rating top games, or community-driven parental rating projects (like those for movies/OTT) could help. Platforms' business models prioritize engagement and spending over safety, with moderation often reactive rather than preventive. Those from the legal fraternity stressed balanced self-regulation within families, while acknowledging that government layers are needed since parents can't monitor everything (e.g., hidden emoji meanings or chats).
Broader solutions include tighter age gating, time restrictions, and penalties for platforms failing safety standards. Countries like Australia have implemented social media bans for under-16s (effective late 2025, deactivating millions of accounts), while Germany and others debate smartphone and social media limits for younger children. The panel agreed: platforms aren't mere toys; they're potent tools that can steal time, expose vulnerabilities, and require collective responsibility—from present parenting to robust regulation—to ensure children are raised by families, not algorithms.